Sep 30 13:58:14 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 13:58:14 crc restorecon[4671]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 13:58:14 crc restorecon[4671]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 13:58:14 crc restorecon[4671]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc 
restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc 
restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc 
restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:14 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 
crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 
13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 
30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 
crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc 
restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 13:58:15 crc restorecon[4671]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 13:58:17 crc kubenswrapper[4676]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:58:17 crc kubenswrapper[4676]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 13:58:17 crc kubenswrapper[4676]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:58:17 crc kubenswrapper[4676]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 13:58:17 crc kubenswrapper[4676]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 13:58:17 crc kubenswrapper[4676]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.044581 4676 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047337 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047355 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047359 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047364 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047368 4676 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047372 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047377 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047380 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047384 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047388 
4676 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047392 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047398 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047403 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047408 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047413 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047418 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047422 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047429 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047433 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047436 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047440 4676 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047444 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047447 4676 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047451 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047455 4676 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047459 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047464 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047467 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047471 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047475 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047479 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047482 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047486 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047490 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047493 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047498 4676 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047503 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047507 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047511 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047514 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047518 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047521 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047525 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047528 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047531 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047535 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047538 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047542 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047546 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047550 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047553 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047564 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047567 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047571 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047575 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047579 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047584 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047587 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047592 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047596 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047600 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047603 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047607 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047610 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047613 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047617 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047621 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047625 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047629 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047632 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.047636 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051041 4676 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051058 4676 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051068 4676 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051074 4676 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051081 4676 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051085 4676 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051092 4676 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051097 4676 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051101 4676 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051106 4676 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051110 4676 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051115 4676 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051119 4676 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051123 4676 flags.go:64] FLAG: --cgroup-root=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051127 4676 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051131 4676 flags.go:64] FLAG: --client-ca-file=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051135 4676 flags.go:64] FLAG: --cloud-config=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051139 4676 flags.go:64] FLAG: --cloud-provider=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051143 4676 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051148 4676 flags.go:64] FLAG: --cluster-domain=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051152 4676 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051156 4676 flags.go:64] FLAG: --config-dir=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051160 4676 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051164 4676 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051170 4676 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051174 4676 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051178 4676 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051183 4676 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051187 4676 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051191 4676 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051195 4676 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051200 4676 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051205 4676 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051211 4676 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051215 4676 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051219 4676 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051224 4676 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051228 4676 flags.go:64] FLAG: --enable-server="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051232 4676 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051238 4676 flags.go:64] FLAG: --event-burst="100"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051242 4676 flags.go:64] FLAG: --event-qps="50"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051246 4676 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051251 4676 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051255 4676 flags.go:64] FLAG: --eviction-hard=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051261 4676 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051265 4676 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051269 4676 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051273 4676 flags.go:64] FLAG: --eviction-soft=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051278 4676 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051282 4676 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051286 4676 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051290 4676 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051294 4676 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051298 4676 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051302 4676 flags.go:64] FLAG: --feature-gates=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051307 4676 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051312 4676 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051317 4676 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051322 4676 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051327 4676 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051331 4676 flags.go:64] FLAG: --help="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051335 4676 flags.go:64] FLAG: --hostname-override=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051339 4676 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051344 4676 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051348 4676 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051352 4676 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051356 4676 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051360 4676 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051365 4676 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051369 4676 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051373 4676 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051377 4676 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051381 4676 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051385 4676 flags.go:64] FLAG: --kube-reserved=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051389 4676 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051393 4676 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051397 4676 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051401 4676 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051405 4676 flags.go:64] FLAG: --lock-file=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051409 4676 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051413 4676 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051417 4676 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051423 4676 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051428 4676 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051432 4676 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051435 4676 flags.go:64] FLAG: --logging-format="text"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051440 4676 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051444 4676 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051449 4676 flags.go:64] FLAG: --manifest-url=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051453 4676 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051463 4676 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051467 4676 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051473 4676 flags.go:64] FLAG: --max-pods="110"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051477 4676 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051482 4676 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051486 4676 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051490 4676 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051494 4676 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051499 4676 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051503 4676 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051513 4676 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051517 4676 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051521 4676 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051525 4676 flags.go:64] FLAG: --pod-cidr=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051530 4676 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051537 4676 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051542 4676 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051546 4676 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051551 4676 flags.go:64] FLAG: --port="10250"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051556 4676 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051560 4676 flags.go:64] FLAG: --provider-id=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051565 4676 flags.go:64] FLAG: --qos-reserved=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051569 4676 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051573 4676 flags.go:64] FLAG: --register-node="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051578 4676 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051582 4676 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051590 4676 flags.go:64] FLAG: --registry-burst="10"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051594 4676 flags.go:64] FLAG: --registry-qps="5"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051598 4676 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051602 4676 flags.go:64] FLAG: --reserved-memory=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051608 4676 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051612 4676 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051616 4676 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051620 4676 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051624 4676 flags.go:64] FLAG: --runonce="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051628 4676 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051632 4676 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051637 4676 flags.go:64] FLAG: --seccomp-default="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051641 4676 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051645 4676 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051650 4676 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051654 4676 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051659 4676 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051662 4676 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051666 4676 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051670 4676 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051674 4676 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051678 4676 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051683 4676 flags.go:64] FLAG: --system-cgroups=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051686 4676 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051693 4676 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051698 4676 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051702 4676 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051706 4676 flags.go:64] FLAG: --tls-min-version=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051710 4676 flags.go:64] FLAG: --tls-private-key-file=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051714 4676 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051718 4676 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051722 4676 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051726 4676 flags.go:64] FLAG: --v="2"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051732 4676 flags.go:64] FLAG: --version="false"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051738 4676 flags.go:64] FLAG: --vmodule=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051743 4676 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.051748 4676 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051855 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051862 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051867 4676 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051871 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051890 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051894 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051898 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051907 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051911 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051914 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051918 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051922 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051927 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051931 4676 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051935 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051939 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051942 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051946 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051949 4676 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051953 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051957 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051963 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051968 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051974 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051979 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051984 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051988 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051992 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.051996 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052001 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052005 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052013 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052017 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052022 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052025 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052029 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052032 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052036 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052039 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052044 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052048 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052051 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052055 4676 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052058 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052062 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052065 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052070 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052074 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052078 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052083 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052089 4676 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052095 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052099 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052104 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052109 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052113 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052118 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052122 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052126 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052130 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052134 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052138 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052141 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052146 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052150 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052153 4676 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052157 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052161 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052166 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052170 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.052175 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.052193 4676 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.077182 4676 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.077229 4676 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077295 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077304 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077310 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077315 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077320 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077324 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077328 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077332 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077336 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077341 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077346 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077350 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077354 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077358 4676 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077362 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077366 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077370 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077373 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077377 4676 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077380 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077384 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077387 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077391 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077394 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077398 4676 feature_gate.go:330] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077401 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077405 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077408 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077412 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077415 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077419 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077423 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077426 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077432 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077436 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077439 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077443 4676 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077446 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077450 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 
13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077453 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077456 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077460 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077464 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077468 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077472 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077475 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077479 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077483 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077487 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077490 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077494 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077498 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077502 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077506 4676 feature_gate.go:353] 
Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077511 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077515 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077519 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077523 4676 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077526 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077530 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077533 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077536 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077540 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077544 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077548 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077551 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077556 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077561 4676 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077564 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077568 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077571 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.077578 4676 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077683 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077689 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077693 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077697 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077701 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077704 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077708 4676 feature_gate.go:330] 
unrecognized feature gate: AzureWorkloadIdentity Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077713 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077717 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077721 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077726 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077730 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077734 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077738 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077743 4676 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077748 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077752 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077756 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077760 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077764 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077767 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077771 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077774 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077780 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077784 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077787 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077791 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077795 4676 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077798 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 
13:58:17.077802 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077805 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077809 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077813 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077816 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077820 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077823 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077827 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077830 4676 feature_gate.go:330] unrecognized feature gate: Example Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077833 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077837 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077840 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077844 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077847 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077851 4676 feature_gate.go:330] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077854 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077858 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077861 4676 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077865 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077868 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077871 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077891 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077895 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077900 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077904 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077908 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077913 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077918 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077924 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077929 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077933 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077938 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077942 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077946 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077950 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077955 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077959 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077963 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077966 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077971 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077975 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.077979 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.077986 4676 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.079902 4676 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.087104 4676 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.087233 4676 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.091660 4676 server.go:997] "Starting client certificate rotation" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.091705 4676 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.096527 4676 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-15 07:39:14.197631945 +0000 UTC Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.096656 4676 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2561h40m57.100979653s for next certificate rotation Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.180188 4676 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.182019 4676 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 13:58:17 crc 
kubenswrapper[4676]: I0930 13:58:17.232759 4676 log.go:25] "Validated CRI v1 runtime API" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.308001 4676 log.go:25] "Validated CRI v1 image API" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.309995 4676 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.321326 4676 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-13-53-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.321384 4676 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.337137 4676 manager.go:217] Machine: {Timestamp:2025-09-30 13:58:17.333633674 +0000 UTC m=+1.316722123 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e459744e-c3c7-46eb-b48a-f9d168ffc645 BootID:1d613b32-e65a-4ccd-985a-b2960fc60b41 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 
Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:32:3b:ad Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:32:3b:ad Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0c:77:97 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:46:b4:42 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d5:80:e6 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5b:31:09 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:da:72:8f:5d:d1:e5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:cf:48:e6:c0:2e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 
Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.337401 4676 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.337649 4676 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.341808 4676 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.342091 4676 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.342128 4676 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.342351 4676 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.342362 4676 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.349271 4676 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.349317 4676 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.349613 4676 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.349727 4676 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.360544 4676 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.360579 4676 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.360608 4676 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.360621 4676 kubelet.go:324] "Adding apiserver pod source"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.360634 4676 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.367982 4676 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.369662 4676 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.370589 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused
Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.370653 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.372938 4676 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.373280 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused
Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.373337 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375539 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375604 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375615 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375624 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375638 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375646 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375655 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375671 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375682 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375691 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375727 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375737 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.375759 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.376211 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.376246 4676 server.go:1280] "Started kubelet"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.376475 4676 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.376626 4676 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 13:58:17 crc systemd[1]: Started Kubernetes Kubelet.
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.378040 4676 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.378318 4676 server.go:460] "Adding debug handlers to kubelet server"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.388668 4676 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.388735 4676 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.389290 4676 volume_manager.go:287] "The desired_state_of_world populator starts"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.389319 4676 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.389323 4676 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 08:39:30.88331086 +0000 UTC
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.389398 4676 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1290h41m13.493917114s for next certificate rotation
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.389651 4676 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.390347 4676 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.390652 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused
Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.390784 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.391230 4676 factory.go:55] Registering systemd factory
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.391310 4676 factory.go:221] Registration of the systemd container factory successfully
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.391899 4676 factory.go:153] Registering CRI-O factory
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.391918 4676 factory.go:221] Registration of the crio container factory successfully
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.392005 4676 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.392043 4676 factory.go:103] Registering Raw factory
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.392062 4676 manager.go:1196] Started watching for new ooms in manager
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.392932 4676 manager.go:319] Starting recovery of all containers
Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.398764 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.31:6443: connect: connection refused" interval="200ms"
Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.397382 4676 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.31:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a1417e9851372 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 13:58:17.376215922 +0000 UTC m=+1.359304351,LastTimestamp:2025-09-30 13:58:17.376215922 +0000 UTC m=+1.359304351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.416148 4676 manager.go:324] Recovery completed
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.417761 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.417894 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.417959 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418021 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418096 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418175 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418233 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418293 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418352 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418417 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418498 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418613 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418678 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418744 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418807 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418902 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.418966 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419023 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419079 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419135 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419211 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419289 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419344 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419411 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419471 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419534 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419597 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419663 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419724 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419784 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419839 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419919 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.419986 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420041 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420101 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420169 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420233 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420290 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420355 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420416 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420473 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420532 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420587 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420646 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420699 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420759 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420819 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420926 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.420992 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.421284 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.421366 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.421450 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.421970 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422194 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422248 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422278 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422535 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422568 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422582 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422602 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422616 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422632 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422711 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422725 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422749 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422793 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422808 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422832 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422917 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422982 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.422999 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423011 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423067 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423103 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423120 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423135 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423149 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423172 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423233 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423258 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423269 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423303 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423321 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert"
seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423333 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423377 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423417 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423428 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423442 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423483 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423522 4676 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423538 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423551 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423565 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423576 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423587 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423604 4676 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423667 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423710 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423721 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423731 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423775 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423788 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423803 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423813 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423949 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.423995 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424006 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424075 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424096 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424109 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424180 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424204 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424219 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424260 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424271 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424311 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424349 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424384 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424404 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424425 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" 
seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424483 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424518 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424537 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.424553 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428652 4676 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428707 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428723 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428736 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428711 4676 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428748 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428760 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428772 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428783 4676 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428794 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428806 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428817 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428829 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428841 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428853 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428919 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428945 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428964 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428979 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.428992 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429004 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429018 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429030 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429041 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429054 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429064 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429076 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429089 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429099 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429113 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429126 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429140 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429152 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429197 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429211 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429225 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429239 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429249 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429260 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" 
seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429271 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429283 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429294 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429303 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429313 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429327 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429339 
4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429352 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429364 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429375 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429387 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429398 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429409 4676 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429419 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429447 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429457 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429469 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429480 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429491 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429502 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429514 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429524 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429535 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429545 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429557 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429567 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429580 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429589 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429601 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429611 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429622 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429646 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429657 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429667 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429678 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429689 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429699 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" 
Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429712 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429722 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429732 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429743 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429752 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429764 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429776 4676 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429792 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429820 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429831 4676 reconstruct.go:97] "Volume reconstruction finished" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.429837 4676 reconciler.go:26] "Reconciler: start to sync state" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.431636 4676 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.431697 4676 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.431723 4676 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.431825 4676 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.432855 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.434298 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.434329 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.434338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.434862 4676 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.434898 4676 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.434918 4676 state_mem.go:36] "Initialized new in-memory state store" Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.436529 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.436612 4676 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.465553 4676 policy_none.go:49] "None policy: Start" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.466510 4676 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.466550 4676 state_mem.go:35] "Initializing new in-memory state store" Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.490903 4676 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.525390 4676 manager.go:334] "Starting Device Plugin manager" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.525500 4676 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.525513 4676 server.go:79] "Starting device plugin registration server" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.525870 4676 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.525905 4676 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.526025 4676 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.526984 4676 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.527003 4676 plugin_manager.go:118] "Starting Kubelet 
Plugin Manager" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.532154 4676 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.532233 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.533171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.533526 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.533538 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.533647 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.534023 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.534044 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535270 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535305 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535471 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535640 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.535671 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.535642 4676 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536410 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536440 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536510 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536620 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536738 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.536763 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537227 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537259 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537332 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537465 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537493 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537594 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537664 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.537979 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.538008 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.538020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.538044 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.538059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.538069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.538183 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.538211 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.539289 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.539313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.539320 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.599676 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.31:6443: connect: connection refused" interval="400ms" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.626970 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.628410 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.628447 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.628457 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.628485 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.629267 4676 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.31:6443: connect: connection refused" node="crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631378 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631421 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631447 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631470 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631492 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631516 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631533 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631550 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631581 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631613 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631633 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631656 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631676 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631710 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.631760 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733175 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733261 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733287 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733306 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733324 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733338 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733354 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733370 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733386 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733402 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733417 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733431 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733446 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733464 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733476 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733530 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733559 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733592 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733596 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733621 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733619 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733641 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733653 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733671 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733678 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733696 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733702 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc 
kubenswrapper[4676]: I0930 13:58:17.733435 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733479 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.733733 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.829757 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.831006 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.831068 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.831080 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.831108 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:58:17 crc kubenswrapper[4676]: E0930 13:58:17.831667 4676 kubelet_node_status.go:99] "Unable 
to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.31:6443: connect: connection refused" node="crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.866542 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.882023 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.898332 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.904312 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cf0ee2c8025cac969f51045443a53684b0e2b1bff3e7173908fe9c60f491ad89 WatchSource:0}: Error finding container cf0ee2c8025cac969f51045443a53684b0e2b1bff3e7173908fe9c60f491ad89: Status 404 returned error can't find the container with id cf0ee2c8025cac969f51045443a53684b0e2b1bff3e7173908fe9c60f491ad89 Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.906537 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: I0930 13:58:17.910898 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.925635 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-44feb765c16eb2e7f782d2a55f8d125735747c0fd49bc43deba14a4d477691f6 WatchSource:0}: Error finding container 44feb765c16eb2e7f782d2a55f8d125735747c0fd49bc43deba14a4d477691f6: Status 404 returned error can't find the container with id 44feb765c16eb2e7f782d2a55f8d125735747c0fd49bc43deba14a4d477691f6 Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.930579 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f5cf96a9c8522d96b1cd44794a15052c97b1b7b6b607a09ef4234cb80199d3e1 WatchSource:0}: Error finding container f5cf96a9c8522d96b1cd44794a15052c97b1b7b6b607a09ef4234cb80199d3e1: Status 404 returned error can't find the container with id f5cf96a9c8522d96b1cd44794a15052c97b1b7b6b607a09ef4234cb80199d3e1 Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.931180 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0f0ca57dbf4387c154a81e83801526c6f533324081167041d4ffb7162df1bcd6 WatchSource:0}: Error finding container 0f0ca57dbf4387c154a81e83801526c6f533324081167041d4ffb7162df1bcd6: Status 404 returned error can't find the container with id 0f0ca57dbf4387c154a81e83801526c6f533324081167041d4ffb7162df1bcd6 Sep 30 13:58:17 crc kubenswrapper[4676]: W0930 13:58:17.932485 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-54231ceb4f6159a6fa2b022d10c5edeb9e4f1a67c1f2edcc1e2b4f4e0dba8451 
WatchSource:0}: Error finding container 54231ceb4f6159a6fa2b022d10c5edeb9e4f1a67c1f2edcc1e2b4f4e0dba8451: Status 404 returned error can't find the container with id 54231ceb4f6159a6fa2b022d10c5edeb9e4f1a67c1f2edcc1e2b4f4e0dba8451 Sep 30 13:58:18 crc kubenswrapper[4676]: E0930 13:58:18.001320 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.31:6443: connect: connection refused" interval="800ms" Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.232240 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.234660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.234705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.234716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.234747 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:58:18 crc kubenswrapper[4676]: E0930 13:58:18.235356 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.31:6443: connect: connection refused" node="crc" Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.377781 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:18 crc kubenswrapper[4676]: W0930 
13:58:18.429966 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:18 crc kubenswrapper[4676]: E0930 13:58:18.430105 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.435811 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"54231ceb4f6159a6fa2b022d10c5edeb9e4f1a67c1f2edcc1e2b4f4e0dba8451"} Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.436834 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f5cf96a9c8522d96b1cd44794a15052c97b1b7b6b607a09ef4234cb80199d3e1"} Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.437746 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f0ca57dbf4387c154a81e83801526c6f533324081167041d4ffb7162df1bcd6"} Sep 30 13:58:18 crc kubenswrapper[4676]: I0930 13:58:18.438509 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"44feb765c16eb2e7f782d2a55f8d125735747c0fd49bc43deba14a4d477691f6"} Sep 30 13:58:18 crc 
kubenswrapper[4676]: I0930 13:58:18.439486 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cf0ee2c8025cac969f51045443a53684b0e2b1bff3e7173908fe9c60f491ad89"} Sep 30 13:58:18 crc kubenswrapper[4676]: W0930 13:58:18.660450 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:18 crc kubenswrapper[4676]: E0930 13:58:18.660549 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:18 crc kubenswrapper[4676]: W0930 13:58:18.746745 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:18 crc kubenswrapper[4676]: E0930 13:58:18.746837 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:18 crc kubenswrapper[4676]: E0930 13:58:18.803003 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.31:6443: connect: connection refused" interval="1.6s" Sep 30 13:58:18 crc kubenswrapper[4676]: W0930 13:58:18.823237 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:18 crc kubenswrapper[4676]: E0930 13:58:18.823362 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.036396 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.038551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.038598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.038610 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.038639 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:58:19 crc kubenswrapper[4676]: E0930 13:58:19.039278 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.31:6443: connect: connection refused" node="crc" Sep 30 
13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.377326 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.443933 4676 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="c5c198babd51af386813fff3e1bc7b3428c21d2bbfc18c77c3d1b69d664b471e" exitCode=0 Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.444006 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"c5c198babd51af386813fff3e1bc7b3428c21d2bbfc18c77c3d1b69d664b471e"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.444065 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.446056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.446093 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.446103 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.448645 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.448698 4676 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.448712 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.448728 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.448730 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.449716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.449744 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.449754 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.450699 4676 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346" exitCode=0 Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.450758 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.450810 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.451767 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.451805 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.451814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.453519 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da" exitCode=0 Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.453625 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.453814 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.455250 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.455288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:19 crc 
kubenswrapper[4676]: I0930 13:58:19.455299 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.456718 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.457544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.457564 4676 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb" exitCode=0 Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.457587 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.457623 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb"} Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.457664 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.457663 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.458955 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.459006 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:19 crc kubenswrapper[4676]: I0930 13:58:19.459020 4676 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.377040 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:20 crc kubenswrapper[4676]: E0930 13:58:20.403762 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.31:6443: connect: connection refused" interval="3.2s" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.461103 4676 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d" exitCode=0 Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.461174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.461310 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.462255 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.462283 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.462293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.465677 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2b470ddef2c5c0f7a683cfc156369cebaab5235260a98145e32ae118fafdd214"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.465714 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.466612 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.466643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.466657 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.469061 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.469102 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.469122 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.469125 4676 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.469690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.469716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.469727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.478603 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.479111 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.479335 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.479402 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.479420 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.479432 4676 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.479443 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077"} Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.483695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.483746 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.483758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.483711 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.483821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.483834 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:20 crc kubenswrapper[4676]: W0930 13:58:20.484571 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:20 crc kubenswrapper[4676]: E0930 13:58:20.484632 4676 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.495160 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:58:20 crc kubenswrapper[4676]: W0930 13:58:20.557163 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:20 crc kubenswrapper[4676]: E0930 13:58:20.557254 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.639731 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.641150 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.641189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.641199 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:20 crc kubenswrapper[4676]: I0930 13:58:20.641220 4676 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:58:20 crc kubenswrapper[4676]: E0930 13:58:20.642441 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.31:6443: connect: connection refused" node="crc" Sep 30 13:58:20 crc kubenswrapper[4676]: W0930 13:58:20.689283 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.31:6443: connect: connection refused Sep 30 13:58:20 crc kubenswrapper[4676]: E0930 13:58:20.689353 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.31:6443: connect: connection refused" logger="UnhandledError" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.382925 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.482852 4676 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b" exitCode=0 Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.482915 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b"} Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.483014 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 
13:58:21.483062 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.483082 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.483025 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.483304 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484265 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484302 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484330 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484372 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484507 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484515 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484524 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484537 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.484671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.578520 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.578687 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.579844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.579875 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:21 crc kubenswrapper[4676]: I0930 13:58:21.579902 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490208 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899"} Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490259 4676 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3"} Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490280 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4"} Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490293 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7"} Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490303 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f"} Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490334 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490360 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.490291 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491524 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491495 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491550 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491532 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491563 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491554 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491588 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.491567 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.650817 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.651006 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.652119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.652172 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.652183 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 13:58:22 crc kubenswrapper[4676]: I0930 13:58:22.660639 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.492621 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.492703 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.493653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.493691 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.493705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.493804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.493839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.493850 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.843032 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.844181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.844209 
4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.844218 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:23 crc kubenswrapper[4676]: I0930 13:58:23.844237 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 13:58:24 crc kubenswrapper[4676]: I0930 13:58:24.634927 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:24 crc kubenswrapper[4676]: I0930 13:58:24.635240 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:24 crc kubenswrapper[4676]: I0930 13:58:24.636351 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:24 crc kubenswrapper[4676]: I0930 13:58:24.636406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:24 crc kubenswrapper[4676]: I0930 13:58:24.636419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.020807 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.021051 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.022084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.022113 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 
13:58:27.022121 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.208999 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.435465 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.435612 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.436585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.436612 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.436627 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.501759 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.502580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.502620 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:27 crc kubenswrapper[4676]: I0930 13:58:27.502628 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:27 crc kubenswrapper[4676]: E0930 13:58:27.536856 4676 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Sep 30 13:58:28 crc kubenswrapper[4676]: I0930 13:58:28.656075 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:28 crc kubenswrapper[4676]: I0930 13:58:28.656244 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:28 crc kubenswrapper[4676]: I0930 13:58:28.657418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:28 crc kubenswrapper[4676]: I0930 13:58:28.657465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:28 crc kubenswrapper[4676]: I0930 13:58:28.657485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:28 crc kubenswrapper[4676]: I0930 13:58:28.660587 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:29 crc kubenswrapper[4676]: I0930 13:58:29.505242 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:29 crc kubenswrapper[4676]: I0930 13:58:29.506165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:29 crc kubenswrapper[4676]: I0930 13:58:29.506205 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:29 crc kubenswrapper[4676]: I0930 13:58:29.506219 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.315433 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.315496 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.323398 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.323473 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.393781 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]log ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]etcd ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok 
Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-api-request-count-filter ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-startkubeinformers ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/generic-apiserver-start-informers ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/priority-and-fairness-config-consumer ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/priority-and-fairness-filter ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-apiextensions-informers ok Sep 30 13:58:31 crc kubenswrapper[4676]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: [-]poststarthook/crd-informer-synced failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-system-namespaces-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-cluster-authentication-info-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-legacy-token-tracking-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-service-ip-repair-controllers ok Sep 30 13:58:31 crc kubenswrapper[4676]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: 
[-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: [-]poststarthook/bootstrap-controller failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/start-kube-aggregator-informers ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/apiservice-status-local-available-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/apiservice-status-remote-available-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/apiservice-wait-for-first-sync ok Sep 30 13:58:31 crc kubenswrapper[4676]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/kube-apiserver-autoregistration ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]autoregister-completion ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/apiservice-openapi-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: [+]poststarthook/apiservice-openapiv3-controller ok Sep 30 13:58:31 crc kubenswrapper[4676]: livez check failed Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.393846 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.656314 4676 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Sep 30 13:58:31 crc kubenswrapper[4676]: I0930 13:58:31.656404 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.311021 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.311485 4676 trace.go:236] Trace[1744488022]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:58:24.660) (total time: 11650ms): Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[1744488022]: ---"Objects listed" error: 11650ms (13:58:36.311) Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[1744488022]: [11.650883996s] [11.650883996s] END Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.311510 4676 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.315628 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.316752 4676 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.317008 4676 trace.go:236] Trace[996544586]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:58:26.256) (total 
time: 10060ms): Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[996544586]: ---"Objects listed" error: 10060ms (13:58:36.316) Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[996544586]: [10.060405469s] [10.060405469s] END Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.317032 4676 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.322108 4676 trace.go:236] Trace[151956673]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:58:21.438) (total time: 14883ms): Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[151956673]: ---"Objects listed" error: 14883ms (13:58:36.322) Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[151956673]: [14.883707823s] [14.883707823s] END Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.322134 4676 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.322806 4676 trace.go:236] Trace[1523223218]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 13:58:26.104) (total time: 10218ms): Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[1523223218]: ---"Objects listed" error: 10218ms (13:58:36.322) Sep 30 13:58:36 crc kubenswrapper[4676]: Trace[1523223218]: [10.218603114s] [10.218603114s] END Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.322835 4676 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.371399 4676 apiserver.go:52] "Watching apiserver" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.374795 4676 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.375159 4676 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.375567 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.375648 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.376175 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.376218 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.376447 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.376224 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.376588 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.376664 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.376738 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.378708 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.378975 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.379045 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.379086 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.379186 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.379313 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.379472 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.379483 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.379476 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.390027 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 
13:58:36.390164 4676 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.394933 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.403373 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.412604 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417278 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417393 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417434 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417464 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417492 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417520 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417546 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417580 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417609 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417639 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417674 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417698 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417724 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417755 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417779 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417803 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417801 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417832 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417862 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417916 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417936 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417956 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417975 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.417995 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418018 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418040 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418059 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418062 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: 
"bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418079 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418101 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418123 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418144 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418165 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418193 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418276 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418301 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418327 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418350 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418356 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod 
"0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418372 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418453 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418480 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418500 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418529 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418532 
4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418540 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418554 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418652 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418678 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418702 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418725 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418749 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418784 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418807 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418827 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418843 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418864 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418905 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418969 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419000 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419022 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419043 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419069 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419094 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419117 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419141 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419163 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419188 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419213 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419236 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419262 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419285 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419314 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419340 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419365 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419387 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: 
I0930 13:58:36.419411 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419437 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419459 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419481 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419506 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419531 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") 
pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419556 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419580 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419607 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419628 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419652 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419674 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419698 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419722 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419748 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419805 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419829 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419850 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419895 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419918 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419942 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420056 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420081 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420105 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420151 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420172 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420194 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420219 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 
30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420243 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420271 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420297 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420322 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420347 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420372 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420398 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420422 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420446 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420471 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.420495 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: 
I0930 13:58:36.420517 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.421949 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422014 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422050 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422079 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422105 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422135 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422161 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422186 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422211 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422239 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:58:36 
crc kubenswrapper[4676]: I0930 13:58:36.422268 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422293 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422318 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422345 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422370 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422395 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422419 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422447 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422481 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422516 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422542 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422569 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422593 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422621 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422648 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422674 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422708 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422732 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422758 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422788 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422810 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422832 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 
13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422856 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422939 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422965 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422993 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423019 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423042 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423084 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423114 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423143 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423169 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423198 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423223 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423253 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423282 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423311 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423338 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423363 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423390 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423414 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423438 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423465 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423488 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") 
" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423517 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423542 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423568 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423592 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423628 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423664 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423690 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423717 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423746 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423778 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.424793 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.424859 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.424971 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425002 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425029 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425059 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425086 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425112 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425137 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425169 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425193 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425216 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 13:58:36 crc 
kubenswrapper[4676]: I0930 13:58:36.425242 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425270 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425300 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425338 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425406 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425438 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425469 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425532 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425584 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425614 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418724 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427317 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.418739 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419015 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419026 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419160 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419259 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419286 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419459 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419518 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419715 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419748 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427452 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.419854 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.421539 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422179 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422233 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422431 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422561 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422667 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.422962 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423057 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423346 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423379 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423590 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423733 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.423753 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.424144 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.424133 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.424172 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.424640 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425325 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.425630 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.425646 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:58:36.925617917 +0000 UTC m=+20.908706336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.425695 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.426169 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.426514 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.426573 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.426758 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427700 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427748 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427010 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427261 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427625 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427761 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.427959 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428180 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428258 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428325 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428451 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428490 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428528 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428605 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33560->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.428638 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:36.92861264 +0000 UTC m=+20.911701069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428663 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428681 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33560->192.168.126.11:17697: read: connection reset by peer" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428805 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33552->192.168.126.11:17697: read: connection reset by peer" start-of-body= Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428816 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428827 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33552->192.168.126.11:17697: read: connection reset by peer" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428953 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.428799 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429131 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429202 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429243 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429277 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429313 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429349 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429383 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429409 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429437 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429463 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429574 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429593 4676 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429609 4676 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429624 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429639 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429658 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429661 4676 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429709 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429829 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430061 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430189 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430253 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430385 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429674 4676 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430415 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430431 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430421 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.429267 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430434 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.430763 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430779 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.430812 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:36.930795823 +0000 UTC m=+20.913884252 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.433495 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.433753 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.433798 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.433866 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434104 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434165 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434198 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434446 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434407 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434900 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434913 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.434949 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.435099 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.435215 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.435260 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.435298 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.435340 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.435613 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.436912 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.437338 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.437356 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.437375 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.437429 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.437967 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.438334 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.438639 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.438632 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.438840 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.438959 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.438974 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.439082 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.439106 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.439130 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.438459 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.439563 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.441075 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.440631 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.439778 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.439931 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.440578 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.441153 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.442215 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.442808 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443136 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443185 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443206 4676 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443229 4676 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443257 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443277 4676 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443296 4676 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc 
kubenswrapper[4676]: I0930 13:58:36.443311 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443330 4676 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443347 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443367 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443386 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443406 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443422 4676 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443441 4676 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443456 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443473 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443487 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443635 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443654 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443665 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443675 4676 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on 
node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443690 4676 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443702 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443714 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443725 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443738 4676 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443749 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443771 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443782 
4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443799 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443811 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443823 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443839 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443850 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443862 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443938 4676 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443955 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443966 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443976 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443986 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.443998 4676 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.444009 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.444020 4676 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") 
on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.444033 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.444045 4676 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.444056 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.444123 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.430428 4676 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.445205 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.452151 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.452185 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.452290 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:36.952272025 +0000 UTC m=+20.935360454 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.453595 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.454216 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.454394 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.454505 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.454619 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.454711 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.454795 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.455366 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.455437 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.455461 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.455475 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.455534 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:36.955516603 +0000 UTC m=+20.938605242 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.455550 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.455869 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.456077 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.456203 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.456403 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.456692 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.457028 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.458293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.458603 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.459273 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.459362 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.462376 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: 
"31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.462469 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.462834 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.463686 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.464327 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.464482 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.464637 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.464927 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.464991 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.465231 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.465239 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.465754 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.467143 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.467152 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.467197 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.467404 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.467515 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.467709 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.467751 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.468305 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.468908 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.468929 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.470015 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.470162 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.470500 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.470718 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.471090 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.471673 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.471743 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.471763 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.471783 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.471951 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472018 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472112 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472269 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472352 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472399 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472438 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472475 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472484 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472592 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472695 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.472963 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.473661 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.473664 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.473769 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.473898 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.473910 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.475074 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.475262 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.475331 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.476222 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.476345 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.476386 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.476478 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.476492 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.476539 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.476563 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.477208 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.477248 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.477293 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.477423 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.477502 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.477551 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.479659 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.480008 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.480529 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.480800 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.480834 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.480891 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.484995 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.485223 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.489102 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.493958 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.503407 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.503806 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.515113 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.528950 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.532248 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446" exitCode=255 Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.532287 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446"} Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.537820 4676 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.538852 4676 scope.go:117] "RemoveContainer" containerID="3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544409 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544774 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544808 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544856 4676 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on 
node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544870 4676 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544896 4676 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544906 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544916 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544924 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544934 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544943 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc 
kubenswrapper[4676]: I0930 13:58:36.544952 4676 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544960 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544969 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544978 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544986 4676 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.544995 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545003 4676 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545012 4676 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545020 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545028 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545036 4676 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545045 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545054 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545064 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545097 4676 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545107 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545116 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545124 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545133 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545141 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545150 4676 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545158 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") 
on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545166 4676 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545175 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545183 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545208 4676 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545223 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545229 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545241 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545253 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545264 4676 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545275 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545285 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545294 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545304 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545314 4676 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 
13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545324 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545334 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545344 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545357 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545367 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545377 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545388 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545399 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" 
(UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545410 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545420 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545430 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545440 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545453 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545472 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545484 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545499 4676 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545512 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545524 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545542 4676 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545569 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545580 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545588 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545595 4676 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545604 4676 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545611 4676 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545619 4676 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545627 4676 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545635 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545643 4676 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 
13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545657 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545665 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545685 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545694 4676 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545703 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545713 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545721 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545730 4676 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545738 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545747 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545755 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545763 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545771 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545779 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545788 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545808 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545817 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545826 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545835 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545844 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545852 4676 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545860 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on 
node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545868 4676 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545890 4676 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545901 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545910 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545918 4676 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545926 4676 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545934 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545942 4676 reconciler_common.go:293] 
"Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545950 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.545960 4676 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546009 4676 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546018 4676 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546027 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546035 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546045 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546054 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546062 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546071 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546080 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546088 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546096 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546111 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546120 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546130 4676 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546138 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546146 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546154 4676 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546163 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546172 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") 
on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546181 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546191 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546199 4676 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546207 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546215 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546251 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546261 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546269 4676 
reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546278 4676 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546286 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546294 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546302 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546311 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546319 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.546564 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.553889 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.565914 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.574770 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.583606 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.591842 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.601348 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.690602 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 13:58:36 crc kubenswrapper[4676]: W0930 13:58:36.700758 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-850b3ebfec80dba77a36620a813450d011c43ab6527236cb2e490346bc0c0753 WatchSource:0}: Error finding container 850b3ebfec80dba77a36620a813450d011c43ab6527236cb2e490346bc0c0753: Status 404 returned error can't find the container with id 850b3ebfec80dba77a36620a813450d011c43ab6527236cb2e490346bc0c0753 Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.702459 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.706671 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 13:58:36 crc kubenswrapper[4676]: W0930 13:58:36.713908 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6f4ac0d3b1ec31105ba45dab233655e6f21c33c6767d707805b37aa3c93dfcaa WatchSource:0}: Error finding container 6f4ac0d3b1ec31105ba45dab233655e6f21c33c6767d707805b37aa3c93dfcaa: Status 404 returned error can't find the container with id 6f4ac0d3b1ec31105ba45dab233655e6f21c33c6767d707805b37aa3c93dfcaa Sep 30 13:58:36 crc kubenswrapper[4676]: W0930 13:58:36.722969 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-50c9b79eb23e421445ed15444b8b380b08d872b5ce9a3eea4bcd760b9bd422a5 WatchSource:0}: Error finding container 50c9b79eb23e421445ed15444b8b380b08d872b5ce9a3eea4bcd760b9bd422a5: Status 404 returned error can't find the container with id 50c9b79eb23e421445ed15444b8b380b08d872b5ce9a3eea4bcd760b9bd422a5 Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.950564 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.950699 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:36 crc kubenswrapper[4676]: I0930 13:58:36.950737 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.950835 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.950918 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:37.950900017 +0000 UTC m=+21.933988446 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.951216 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.951293 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:58:37.951264706 +0000 UTC m=+21.934353135 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:58:36 crc kubenswrapper[4676]: E0930 13:58:36.951316 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:37.951308207 +0000 UTC m=+21.934396636 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.051722 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.051771 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.051898 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.051914 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.051924 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.051963 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:38.051951094 +0000 UTC m=+22.035039523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.052037 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.052109 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.052123 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.052207 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2025-09-30 13:58:38.05218683 +0000 UTC m=+22.035275329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.234358 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.253836 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.271054 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.273616 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.286157 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.298518 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.311532 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.323492 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.335098 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.347381 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.359746 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.370859 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.383076 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.394234 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.418611 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.435122 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.436122 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.436987 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.437788 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.438536 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.440204 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.440809 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.441931 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.442515 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.443652 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.444271 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.444803 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.446321 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.446811 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.448021 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.448651 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.449428 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.450340 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.451055 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.451436 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.452662 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 
13:58:37.454140 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.454601 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.455692 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.456135 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.457246 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.457652 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.458668 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.459307 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 
13:58:37.459784 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.460734 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.461305 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.462130 4676 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.462195 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.462226 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.464035 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.465161 4676 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.465547 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.467217 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.468271 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.468780 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.469958 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.470595 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.471116 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.472097 4676 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.473101 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.473660 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.474468 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.475021 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.475914 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.476642 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.477750 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.478393 4676 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.478845 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.479331 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.479803 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.480350 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.481237 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.491805 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.506170 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.520240 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.537252 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.538213 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641"} Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.538255 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"850b3ebfec80dba77a36620a813450d011c43ab6527236cb2e490346bc0c0753"} Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.541433 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.550760 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967"} Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.551286 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.552111 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"50c9b79eb23e421445ed15444b8b380b08d872b5ce9a3eea4bcd760b9bd422a5"} Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.553955 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c"} Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.553983 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd"} Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.554014 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6f4ac0d3b1ec31105ba45dab233655e6f21c33c6767d707805b37aa3c93dfcaa"} Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.558646 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.562713 4676 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.577688 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.598931 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.617358 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.632593 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.643500 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.654282 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.664616 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.676757 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.687439 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.699249 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.958682 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 13:58:37.958759 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:37 crc kubenswrapper[4676]: I0930 
13:58:37.958794 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.958910 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:58:39.958858653 +0000 UTC m=+23.941947092 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.958942 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.958961 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.959011 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:39.958994016 +0000 UTC m=+23.942082475 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:37 crc kubenswrapper[4676]: E0930 13:58:37.959032 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:39.959023907 +0000 UTC m=+23.942112426 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.059798 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.059919 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060026 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060078 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060093 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060117 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060146 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060170 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060174 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:40.060153265 +0000 UTC m=+24.043241864 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.060241 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:40.060218007 +0000 UTC m=+24.043306616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.432072 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.432116 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.432123 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.432219 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.432316 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:38 crc kubenswrapper[4676]: E0930 13:58:38.432424 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.660510 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.664539 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.667206 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.671134 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 
2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.681504 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.692540 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.710959 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.722395 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.738771 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.748749 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.758466 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.768713 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.780212 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.790612 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.810535 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.823269 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.834484 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.847160 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.858488 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:38 crc kubenswrapper[4676]: I0930 13:58:38.870262 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.559892 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b"} Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.577686 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.590034 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.606333 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.626268 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.640416 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.656781 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.670362 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.686147 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.697731 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.974379 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.974465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:39 crc kubenswrapper[4676]: I0930 13:58:39.974500 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:39 crc kubenswrapper[4676]: E0930 13:58:39.974587 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:58:43.974560109 +0000 UTC m=+27.957648528 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:58:39 crc kubenswrapper[4676]: E0930 13:58:39.974597 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:39 crc kubenswrapper[4676]: E0930 13:58:39.974674 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:43.974665242 +0000 UTC m=+27.957753761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:39 crc kubenswrapper[4676]: E0930 13:58:39.974675 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:39 crc kubenswrapper[4676]: E0930 13:58:39.974826 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 13:58:43.974800075 +0000 UTC m=+27.957888494 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:40 crc kubenswrapper[4676]: I0930 13:58:40.075946 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:40 crc kubenswrapper[4676]: I0930 13:58:40.076032 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076152 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076167 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076179 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076230 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:44.07621627 +0000 UTC m=+28.059304699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076250 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076300 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076317 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.076383 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:44.076360393 +0000 UTC m=+28.059448902 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:40 crc kubenswrapper[4676]: I0930 13:58:40.432988 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:40 crc kubenswrapper[4676]: I0930 13:58:40.432999 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:40 crc kubenswrapper[4676]: I0930 13:58:40.432988 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.433119 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.433300 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:40 crc kubenswrapper[4676]: E0930 13:58:40.433364 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.560992 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7stxd"] Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.561279 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.564137 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.564161 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.564134 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.569986 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.577076 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.587662 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.598630 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.607189 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.617117 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.633428 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.647210 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.672517 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.687101 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.687633 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkkjq\" (UniqueName: \"kubernetes.io/projected/2e065baf-f38b-4397-bbbe-ef52eea10f84-kube-api-access-nkkjq\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.687683 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/2e065baf-f38b-4397-bbbe-ef52eea10f84-host\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.687707 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e065baf-f38b-4397-bbbe-ef52eea10f84-serviceca\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.697511 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.789035 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkkjq\" (UniqueName: \"kubernetes.io/projected/2e065baf-f38b-4397-bbbe-ef52eea10f84-kube-api-access-nkkjq\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.789079 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e065baf-f38b-4397-bbbe-ef52eea10f84-host\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.789113 4676 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e065baf-f38b-4397-bbbe-ef52eea10f84-serviceca\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.789162 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e065baf-f38b-4397-bbbe-ef52eea10f84-host\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.792137 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2e065baf-f38b-4397-bbbe-ef52eea10f84-serviceca\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.817584 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkkjq\" (UniqueName: \"kubernetes.io/projected/2e065baf-f38b-4397-bbbe-ef52eea10f84-kube-api-access-nkkjq\") pod \"node-ca-7stxd\" (UID: \"2e065baf-f38b-4397-bbbe-ef52eea10f84\") " pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.874766 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7stxd" Sep 30 13:58:41 crc kubenswrapper[4676]: W0930 13:58:41.886664 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e065baf_f38b_4397_bbbe_ef52eea10f84.slice/crio-e723a5d74c7d6ee207a6b8a9649bfc3c9b08553355d035830bc6939d01f849e4 WatchSource:0}: Error finding container e723a5d74c7d6ee207a6b8a9649bfc3c9b08553355d035830bc6939d01f849e4: Status 404 returned error can't find the container with id e723a5d74c7d6ee207a6b8a9649bfc3c9b08553355d035830bc6939d01f849e4 Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.936398 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qmgfw"] Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.936710 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.938272 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.938662 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.938851 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.952481 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.965760 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.981075 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:41 crc kubenswrapper[4676]: I0930 13:58:41.998039 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:41Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.017281 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.032536 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.052275 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.079983 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.093571 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mwsm\" (UniqueName: \"kubernetes.io/projected/2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074-kube-api-access-5mwsm\") pod \"node-resolver-qmgfw\" (UID: \"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\") " pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.093636 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074-hosts-file\") pod \"node-resolver-qmgfw\" (UID: \"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\") " pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.102942 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.122920 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.140869 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.194269 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mwsm\" (UniqueName: \"kubernetes.io/projected/2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074-kube-api-access-5mwsm\") pod \"node-resolver-qmgfw\" (UID: \"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\") " pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.194320 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074-hosts-file\") pod \"node-resolver-qmgfw\" (UID: \"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\") " pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.194391 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074-hosts-file\") pod \"node-resolver-qmgfw\" (UID: \"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\") " pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.213539 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mwsm\" (UniqueName: \"kubernetes.io/projected/2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074-kube-api-access-5mwsm\") pod \"node-resolver-qmgfw\" (UID: \"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\") " pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.248052 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qmgfw" Sep 30 13:58:42 crc kubenswrapper[4676]: W0930 13:58:42.272697 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c0fc06a_eb2b_4fa4_8554_ce8a0fd62074.slice/crio-2c558d4a4366b2ea2ebaca7d4f2658d14aaa4eab29fb10aff002efb4e6caae79 WatchSource:0}: Error finding container 2c558d4a4366b2ea2ebaca7d4f2658d14aaa4eab29fb10aff002efb4e6caae79: Status 404 returned error can't find the container with id 2c558d4a4366b2ea2ebaca7d4f2658d14aaa4eab29fb10aff002efb4e6caae79 Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.367145 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-s7q5x"] Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.367514 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.368332 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ksfzg"] Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.369237 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.370645 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.371127 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.371278 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.375366 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.375649 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.375653 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.375906 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.377857 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4k2dp"] Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.378528 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.388178 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.388587 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.388870 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.393167 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.396254 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.404431 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.423465 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.432236 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.432355 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.432449 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.432528 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.432697 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.432835 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.439042 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.451130 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.465006 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.477593 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.490041 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497150 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-socket-dir-parent\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497185 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-os-release\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497200 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497218 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-netns\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497235 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-multus-certs\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497291 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-cnibin\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497342 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cni-binary-copy\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497374 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-kubelet\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497397 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cnibin\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497425 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af133cb7-f0e4-428e-b348-c6e81493fc1d-mcd-auth-proxy-config\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497464 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-os-release\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-hostroot\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497503 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94m48\" (UniqueName: \"kubernetes.io/projected/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-kube-api-access-94m48\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497528 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-cni-multus\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497547 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497566 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7djb\" (UniqueName: \"kubernetes.io/projected/af133cb7-f0e4-428e-b348-c6e81493fc1d-kube-api-access-z7djb\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497581 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-cni-bin\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497597 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12808c49-1bed-4251-bcbe-fad6207eea57-cni-binary-copy\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497611 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-etc-kubernetes\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497626 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84swg\" (UniqueName: \"kubernetes.io/projected/12808c49-1bed-4251-bcbe-fad6207eea57-kube-api-access-84swg\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497651 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/af133cb7-f0e4-428e-b348-c6e81493fc1d-rootfs\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497669 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af133cb7-f0e4-428e-b348-c6e81493fc1d-proxy-tls\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497693 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-k8s-cni-cncf-io\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497707 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-conf-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497725 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/12808c49-1bed-4251-bcbe-fad6207eea57-multus-daemon-config\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497746 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-cni-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497761 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-system-cni-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.497780 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-system-cni-dir\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.502673 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.514306 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.525830 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.552855 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.568187 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7stxd" event={"ID":"2e065baf-f38b-4397-bbbe-ef52eea10f84","Type":"ContainerStarted","Data":"d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1"}
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.568242 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7stxd" event={"ID":"2e065baf-f38b-4397-bbbe-ef52eea10f84","Type":"ContainerStarted","Data":"e723a5d74c7d6ee207a6b8a9649bfc3c9b08553355d035830bc6939d01f849e4"}
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.569617 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qmgfw" event={"ID":"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074","Type":"ContainerStarted","Data":"cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f"}
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.569656 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qmgfw" event={"ID":"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074","Type":"ContainerStarted","Data":"2c558d4a4366b2ea2ebaca7d4f2658d14aaa4eab29fb10aff002efb4e6caae79"}
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.572846 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z"
Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.587594 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598772 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-multus-certs\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-cnibin\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598830 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cni-binary-copy\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598849 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-kubelet\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598865 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cnibin\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " 
pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598899 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94m48\" (UniqueName: \"kubernetes.io/projected/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-kube-api-access-94m48\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598913 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-multus-certs\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598922 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af133cb7-f0e4-428e-b348-c6e81493fc1d-mcd-auth-proxy-config\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598963 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-cnibin\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.598982 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-os-release\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc 
kubenswrapper[4676]: I0930 13:58:42.599003 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-kubelet\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599010 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-hostroot\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599032 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-cni-multus\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599033 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cnibin\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599054 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599129 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z7djb\" (UniqueName: \"kubernetes.io/projected/af133cb7-f0e4-428e-b348-c6e81493fc1d-kube-api-access-z7djb\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599160 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-cni-bin\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599183 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12808c49-1bed-4251-bcbe-fad6207eea57-cni-binary-copy\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599202 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-etc-kubernetes\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84swg\" (UniqueName: \"kubernetes.io/projected/12808c49-1bed-4251-bcbe-fad6207eea57-kube-api-access-84swg\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599242 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-os-release\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599267 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/af133cb7-f0e4-428e-b348-c6e81493fc1d-rootfs\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599273 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-cni-bin\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599288 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af133cb7-f0e4-428e-b348-c6e81493fc1d-proxy-tls\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599311 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-k8s-cni-cncf-io\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599334 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-conf-dir\") pod 
\"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599357 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/12808c49-1bed-4251-bcbe-fad6207eea57-multus-daemon-config\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599402 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-cni-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599426 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-system-cni-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599447 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-system-cni-dir\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599481 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-socket-dir-parent\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " 
pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599501 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-os-release\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599521 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599546 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-k8s-cni-cncf-io\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599557 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af133cb7-f0e4-428e-b348-c6e81493fc1d-mcd-auth-proxy-config\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599547 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-netns\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " 
pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599580 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-run-netns\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599607 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-conf-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599640 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-system-cni-dir\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599760 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cni-binary-copy\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599805 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-hostroot\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599771 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599850 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/af133cb7-f0e4-428e-b348-c6e81493fc1d-rootfs\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599863 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-host-var-lib-cni-multus\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599918 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-etc-kubernetes\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.599973 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-socket-dir-parent\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.600004 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-multus-cni-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.600039 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-os-release\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.600079 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12808c49-1bed-4251-bcbe-fad6207eea57-cni-binary-copy\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.600117 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/12808c49-1bed-4251-bcbe-fad6207eea57-multus-daemon-config\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.600184 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12808c49-1bed-4251-bcbe-fad6207eea57-system-cni-dir\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.600607 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: 
\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.604590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af133cb7-f0e4-428e-b348-c6e81493fc1d-proxy-tls\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.605441 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.617268 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7djb\" (UniqueName: \"kubernetes.io/projected/af133cb7-f0e4-428e-b348-c6e81493fc1d-kube-api-access-z7djb\") pod \"machine-config-daemon-4k2dp\" (UID: \"af133cb7-f0e4-428e-b348-c6e81493fc1d\") " pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.619999 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94m48\" (UniqueName: \"kubernetes.io/projected/70e11f41-7d9e-49ef-a2f5-0691d5f8f631-kube-api-access-94m48\") pod \"multus-additional-cni-plugins-ksfzg\" (UID: \"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\") " pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.623121 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84swg\" (UniqueName: \"kubernetes.io/projected/12808c49-1bed-4251-bcbe-fad6207eea57-kube-api-access-84swg\") pod \"multus-s7q5x\" (UID: \"12808c49-1bed-4251-bcbe-fad6207eea57\") " pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 
13:58:42.625037 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.636990 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.649383 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.663860 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.674812 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.687558 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s7q5x" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.688933 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.694825 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" Sep 30 13:58:42 crc kubenswrapper[4676]: W0930 13:58:42.698234 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12808c49_1bed_4251_bcbe_fad6207eea57.slice/crio-5ff2c250c2cd0afc925f670c5b2d4a94c4e31e430c547567636e834e2b381029 WatchSource:0}: Error finding container 5ff2c250c2cd0afc925f670c5b2d4a94c4e31e430c547567636e834e2b381029: Status 404 returned error can't find the container with id 5ff2c250c2cd0afc925f670c5b2d4a94c4e31e430c547567636e834e2b381029 Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.700320 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.707442 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: W0930 13:58:42.707539 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e11f41_7d9e_49ef_a2f5_0691d5f8f631.slice/crio-d4c03e007f98d5d729688393967f46f4605a9c871aa09e22509b2424c7c37716 WatchSource:0}: Error finding container d4c03e007f98d5d729688393967f46f4605a9c871aa09e22509b2424c7c37716: Status 404 returned error can't find the container with id d4c03e007f98d5d729688393967f46f4605a9c871aa09e22509b2424c7c37716 Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.715763 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.717356 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.717390 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.717401 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.717522 4676 kubelet_node_status.go:76] "Attempting to register 
node" node="crc" Sep 30 13:58:42 crc kubenswrapper[4676]: W0930 13:58:42.718152 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf133cb7_f0e4_428e_b348_c6e81493fc1d.slice/crio-162e34529e78d4dd19cf776e5d3908779f0051b50cc50409879c8641d79c62e6 WatchSource:0}: Error finding container 162e34529e78d4dd19cf776e5d3908779f0051b50cc50409879c8641d79c62e6: Status 404 returned error can't find the container with id 162e34529e78d4dd19cf776e5d3908779f0051b50cc50409879c8641d79c62e6 Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.718982 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.728024 4676 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.728473 4676 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.730045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.730116 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.730129 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.730147 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.730159 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:42Z","lastTransitionTime":"2025-09-30T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.734353 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.737176 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9775s"] Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.738212 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.741787 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.741902 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.742076 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.742179 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.742365 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.742469 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.742997 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.748733 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.751187 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.755743 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.755792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.755804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.755826 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.755839 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:42Z","lastTransitionTime":"2025-09-30T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.759089 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.768044 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.771528 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.775867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.775916 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.775926 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.775944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.775957 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:42Z","lastTransitionTime":"2025-09-30T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.789858 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.789934 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.792844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.792871 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.792931 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.792951 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.792960 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:42Z","lastTransitionTime":"2025-09-30T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.801463 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.804546 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.807670 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.807712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.807727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.807749 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.807761 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:42Z","lastTransitionTime":"2025-09-30T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.816390 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.822923 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: E0930 13:58:42.823052 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.826182 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.826214 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.826228 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.826251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.826266 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:42Z","lastTransitionTime":"2025-09-30T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.832029 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.845428 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.866671 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344
a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.881393 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.899801 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.901272 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-env-overrides\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.901400 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-ovn-kubernetes\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.901515 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc 
kubenswrapper[4676]: I0930 13:58:42.901636 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r529k\" (UniqueName: \"kubernetes.io/projected/4fae6bdf-2a3f-4961-934d-b8f653412538-kube-api-access-r529k\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.901721 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-systemd-units\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.901800 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-script-lib\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.901939 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-log-socket\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902059 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fae6bdf-2a3f-4961-934d-b8f653412538-ovn-node-metrics-cert\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902161 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902277 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-kubelet\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902387 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-node-log\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902489 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-config\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902595 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-etc-openvswitch\") pod \"ovnkube-node-9775s\" (UID: 
\"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902697 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-systemd\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902838 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-ovn\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902914 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-netns\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902944 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-bin\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902961 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-netd\") pod \"ovnkube-node-9775s\" (UID: 
\"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.902993 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-slash\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.903016 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-var-lib-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.913803 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.924645 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.928840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.928926 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.928941 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.928965 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.928980 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:42Z","lastTransitionTime":"2025-09-30T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.935727 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.953243 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.965340 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.977654 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:42 crc kubenswrapper[4676]: I0930 13:58:42.986844 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:42Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004199 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004503 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-kubelet\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004586 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-kubelet\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004589 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-config\") pod \"ovnkube-node-9775s\" (UID: 
\"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004365 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004665 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-node-log\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004692 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-etc-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004723 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-systemd\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004739 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-ovn\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc 
kubenswrapper[4676]: I0930 13:58:43.004768 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-bin\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004774 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-node-log\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004805 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-netd\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004808 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-systemd\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004787 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-netd\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004836 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-bin\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004860 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-ovn\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004892 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-etc-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004962 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-netns\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.004990 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-slash\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005058 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-var-lib-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005063 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-netns\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005091 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-env-overrides\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005112 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-ovn-kubernetes\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005119 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-slash\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005140 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005162 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r529k\" (UniqueName: \"kubernetes.io/projected/4fae6bdf-2a3f-4961-934d-b8f653412538-kube-api-access-r529k\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005177 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-script-lib\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005180 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-ovn-kubernetes\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005199 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-systemd-units\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005216 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4fae6bdf-2a3f-4961-934d-b8f653412538-ovn-node-metrics-cert\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005231 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-log-socket\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005251 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005268 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-log-socket\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005317 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-systemd-units\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005470 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-var-lib-openvswitch\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.005799 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-env-overrides\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.006302 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-script-lib\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.006342 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-config\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.008872 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fae6bdf-2a3f-4961-934d-b8f653412538-ovn-node-metrics-cert\") pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.022033 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r529k\" (UniqueName: \"kubernetes.io/projected/4fae6bdf-2a3f-4961-934d-b8f653412538-kube-api-access-r529k\") 
pod \"ovnkube-node-9775s\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.031208 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.031239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.031247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.031262 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.031272 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.078006 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:43 crc kubenswrapper[4676]: W0930 13:58:43.088578 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fae6bdf_2a3f_4961_934d_b8f653412538.slice/crio-b4b3ede39112fa145ad47d0259f78cb10b78ed13fbc9ab5e8d51d85991a81773 WatchSource:0}: Error finding container b4b3ede39112fa145ad47d0259f78cb10b78ed13fbc9ab5e8d51d85991a81773: Status 404 returned error can't find the container with id b4b3ede39112fa145ad47d0259f78cb10b78ed13fbc9ab5e8d51d85991a81773 Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.133406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.133916 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.133928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.133941 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.133949 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.236239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.236276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.236287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.236303 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.236317 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.338620 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.338653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.338662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.338677 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.338686 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.440459 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.440503 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.440514 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.440530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.440543 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.543056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.543117 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.543127 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.543142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.543152 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.576265 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerStarted","Data":"397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.576311 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerStarted","Data":"5ff2c250c2cd0afc925f670c5b2d4a94c4e31e430c547567636e834e2b381029"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.581417 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f" exitCode=0 Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.581495 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.581541 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"b4b3ede39112fa145ad47d0259f78cb10b78ed13fbc9ab5e8d51d85991a81773"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.583358 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.583384 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.583394 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"162e34529e78d4dd19cf776e5d3908779f0051b50cc50409879c8641d79c62e6"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.584605 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerStarted","Data":"e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.584638 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerStarted","Data":"d4c03e007f98d5d729688393967f46f4605a9c871aa09e22509b2424c7c37716"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.591976 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.604217 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.616179 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.629072 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.642582 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.644929 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.644965 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.644974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.644990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.645002 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.662976 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.673799 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.688487 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.699316 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.713521 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.723451 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d
188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.732942 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.748267 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.748344 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.748356 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.748374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.748386 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.751682 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.767095 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.793424 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.811375 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.840348 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":f
alse,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\
\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.850989 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.851030 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.851040 4676 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.851054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.851064 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.857495 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.869766 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.881276 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.892132 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.906826 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.923871 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.936535 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.953491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.953521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.953531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.953545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.953555 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:43Z","lastTransitionTime":"2025-09-30T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.956042 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.970834 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.984216 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:43 crc kubenswrapper[4676]: I0930 13:58:43.996942 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:43Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.010719 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.015061 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.015176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.015233 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.015243 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.015285 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:52.015272226 +0000 UTC m=+35.998360655 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.015303 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:58:52.015291177 +0000 UTC m=+35.998379606 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.015408 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.015517 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:52.015495782 +0000 UTC m=+35.998584271 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.025502 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":
\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.056515 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.056565 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.056576 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.056595 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.056606 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.116254 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.116366 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116421 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116449 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116469 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116519 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:52.116506858 +0000 UTC m=+36.099595277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116574 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116618 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116634 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.116709 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:52.116687662 +0000 UTC m=+36.099776101 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.158789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.158828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.158840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.158858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.158869 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.261736 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.261784 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.261795 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.261814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.261836 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.364695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.364738 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.364750 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.364777 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.364791 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.432616 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.432688 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.432745 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.432830 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.432963 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:44 crc kubenswrapper[4676]: E0930 13:58:44.433138 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.466670 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.466708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.466716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.466731 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.466745 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.569559 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.569602 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.569615 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.569632 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.569644 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.588833 4676 generic.go:334] "Generic (PLEG): container finished" podID="70e11f41-7d9e-49ef-a2f5-0691d5f8f631" containerID="e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8" exitCode=0 Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.588926 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerDied","Data":"e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.592521 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.592584 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.592602 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.592615 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.592625 4676 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.592660 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.604410 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.616202 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.629017 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.639899 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.658318 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.672004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.672039 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.672048 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.672060 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.672068 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.675970 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.688993 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.701454 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.715179 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.727426 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.740624 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.752754 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.767249 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.776218 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.776277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.776291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.776308 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.776322 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.785630 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.799480 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.879247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.879700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.879712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc 
kubenswrapper[4676]: I0930 13:58:44.879729 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.879739 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.982584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.982624 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.982634 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.982651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:44 crc kubenswrapper[4676]: I0930 13:58:44.982662 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:44Z","lastTransitionTime":"2025-09-30T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.084592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.084630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.084639 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.084652 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.084661 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.186637 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.186673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.186681 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.186695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.186704 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.290247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.290285 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.290295 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.290314 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.290326 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.392789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.392843 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.392854 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.392893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.392907 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.495868 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.496560 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.496582 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.496657 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.496677 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.598062 4676 generic.go:334] "Generic (PLEG): container finished" podID="70e11f41-7d9e-49ef-a2f5-0691d5f8f631" containerID="f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c" exitCode=0 Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.598110 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerDied","Data":"f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.599135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.599183 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.599195 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.599213 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.599225 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.615352 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.627178 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.638436 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.648273 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.661421 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.685918 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.697932 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.701546 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.701577 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.701586 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.701601 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.701609 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.711680 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.728365 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.743094 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7
cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.754273 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.774766 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.795009 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.804381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.804426 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.804437 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc 
kubenswrapper[4676]: I0930 13:58:45.804455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.804467 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.808075 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.819592 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:45Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.906389 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.906466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.906483 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.906510 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:45 crc kubenswrapper[4676]: I0930 13:58:45.906527 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:45Z","lastTransitionTime":"2025-09-30T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.009091 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.009157 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.009167 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.009186 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.009200 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.112000 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.112047 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.112059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.112077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.112087 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.215237 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.215282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.215294 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.215313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.215325 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.318017 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.318067 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.318076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.318091 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.318105 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.420495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.420545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.420555 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.420572 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.420582 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.432050 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.432131 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.432165 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:46 crc kubenswrapper[4676]: E0930 13:58:46.432237 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:46 crc kubenswrapper[4676]: E0930 13:58:46.432179 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:46 crc kubenswrapper[4676]: E0930 13:58:46.432340 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.523283 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.523325 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.523333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.523346 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.523357 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.603504 4676 generic.go:334] "Generic (PLEG): container finished" podID="70e11f41-7d9e-49ef-a2f5-0691d5f8f631" containerID="70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78" exitCode=0 Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.603540 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerDied","Data":"70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.622626 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.626424 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.626466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.626477 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.626497 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.626509 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.637084 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 
13:58:46.649651 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.662456 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.674540 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.684326 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.693150 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.710349 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.722356 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.729095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.729135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.729143 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.729157 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.729169 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.734474 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.756280 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344
a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.769032 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.781698 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.794848 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.806659 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:46Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.831081 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.831122 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.831134 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.831154 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.831166 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.933201 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.933232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.933241 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.933253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:46 crc kubenswrapper[4676]: I0930 13:58:46.933262 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:46Z","lastTransitionTime":"2025-09-30T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.035870 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.035921 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.035932 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.035950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.035961 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.138400 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.138455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.138472 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.138496 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.138513 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.240332 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.240374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.240385 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.240400 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.240409 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.343171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.343202 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.343210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.343222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.343231 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.445109 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.445192 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.445210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.445239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.445259 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.455748 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.472780 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7b
d9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.486631 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.503837 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.518373 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.531547 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.547076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.547124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.547137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.547154 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.547168 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.554351 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.570758 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.584533 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.596465 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.616009 4676 generic.go:334] "Generic (PLEG): container finished" podID="70e11f41-7d9e-49ef-a2f5-0691d5f8f631" containerID="62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610" exitCode=0 Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.616123 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerDied","Data":"62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.617190 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.623831 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.633022 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mul
tus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"sta
rtTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.649408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.649456 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.649471 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.649494 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.649509 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.653546 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.669172 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.683079 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.702336 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.713926 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.735987 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.752606 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.753217 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.753261 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.753389 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc 
kubenswrapper[4676]: I0930 13:58:47.753416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.753425 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.766603 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.778602 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.791376 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.805072 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.819114 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.831708 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.846045 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.856687 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.856745 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.856756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.856778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.856792 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.870049 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.887871 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.905752 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.923616 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.960442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.960521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.960536 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.960563 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:47 crc kubenswrapper[4676]: I0930 13:58:47.960584 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:47Z","lastTransitionTime":"2025-09-30T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.063063 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.063155 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.063169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.063197 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.063211 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.166381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.166443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.166460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.166489 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.166518 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.268668 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.268696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.268704 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.268721 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.268730 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.372103 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.372171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.372182 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.372207 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.372253 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.433010 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.433045 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.433009 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:48 crc kubenswrapper[4676]: E0930 13:58:48.433204 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:48 crc kubenswrapper[4676]: E0930 13:58:48.433468 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:48 crc kubenswrapper[4676]: E0930 13:58:48.433369 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.475333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.475387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.475400 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.475420 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.475433 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.578136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.578260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.578281 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.578313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.578332 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.631601 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerStarted","Data":"4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.681779 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.681836 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.681845 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.681868 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.681911 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.786321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.786756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.786773 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.786801 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.786821 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.890439 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.890511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.890533 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.890558 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.890577 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.993604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.993686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.993701 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.993722 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:48 crc kubenswrapper[4676]: I0930 13:58:48.993737 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:48Z","lastTransitionTime":"2025-09-30T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.097025 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.097099 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.097116 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.097152 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.097205 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.201124 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.201667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.201705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.201716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.201730 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.201741 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.216754 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.231066 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.242692 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.257813 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.269343 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.278968 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.288451 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.304438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.304485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.304496 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.304511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.304522 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.305441 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.317258 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.333093 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.358993 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.373331 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.386339 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.396283 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.407247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.407282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.407291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.407304 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.407313 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.408395 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.509708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.510033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.510165 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.510296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.510404 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.613224 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.613512 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.613584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.613648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.613701 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.649503 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.665571 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.681871 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.694251 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.708997 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.716215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.716257 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.716269 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.716290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.716302 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.732384 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.746699 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.765741 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1
e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.777616 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.789806 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m
48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.798167 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520e
d63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.807913 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.817940 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.817995 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.818009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.818026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.818037 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.824484 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.839343 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.852014 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:49Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.920712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.920765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.920775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.920789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:49 crc kubenswrapper[4676]: I0930 13:58:49.920800 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:49Z","lastTransitionTime":"2025-09-30T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.022999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.023034 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.023043 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.023056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.023066 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.125048 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.125083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.125092 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.125107 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.125117 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.227585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.227625 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.227634 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.227649 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.227659 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.329630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.329657 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.329666 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.329679 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.329691 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.432018 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.432012 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.432157 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:50 crc kubenswrapper[4676]: E0930 13:58:50.432297 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:50 crc kubenswrapper[4676]: E0930 13:58:50.432441 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:50 crc kubenswrapper[4676]: E0930 13:58:50.432517 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.432964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.433008 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.433023 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.433042 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.433056 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.536032 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.536072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.536086 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.536107 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.536120 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.639543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.639618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.639635 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.639654 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.639665 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.642927 4676 generic.go:334] "Generic (PLEG): container finished" podID="70e11f41-7d9e-49ef-a2f5-0691d5f8f631" containerID="4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645" exitCode=0 Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.643030 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerDied","Data":"4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.648696 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.648959 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.649037 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.666062 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.685168 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.699079 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.706033 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.706093 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.713508 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.727746 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.741811 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.744164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.744196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.744207 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.744222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.744235 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.754582 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z 
is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.771257 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.791458 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.804912 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.815980 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.826922 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.835349 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.846656 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.847222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.847251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.847260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.847273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.847283 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.864388 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.878210 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7b
d9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-0
9-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.889389 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.900084 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.912669 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.922308 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.931098 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.948155 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.949823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.949849 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.949857 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.949872 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.949895 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:50Z","lastTransitionTime":"2025-09-30T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.969138 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.980845 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:50 crc kubenswrapper[4676]: I0930 13:58:50.993281 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:50Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.005345 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.016482 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.026371 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.038390 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.051686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.051727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.051736 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.051751 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.051759 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.059567 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.154343 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.154380 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.154390 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.154408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.154421 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.256301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.256353 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.256362 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.256377 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.256387 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.358216 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.358259 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.358271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.358288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.358300 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.461306 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.461431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.461458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.461489 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.461508 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.563647 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.563680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.563690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.563714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.563725 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.654280 4676 generic.go:334] "Generic (PLEG): container finished" podID="70e11f41-7d9e-49ef-a2f5-0691d5f8f631" containerID="3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d" exitCode=0 Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.654340 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerDied","Data":"3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.654434 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.666183 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.666222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.666234 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.666253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.666265 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.669174 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.684575 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.695352 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.715742 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.726590 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.739591 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.748818 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.758487 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.769296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.769376 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.769389 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.769406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.769417 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.770503 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.780748 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.790452 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.801849 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.825580 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.844049 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.858126 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:51Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.872316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.872354 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.872364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.872380 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.872407 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.974429 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.974457 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.974465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.974477 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:51 crc kubenswrapper[4676]: I0930 13:58:51.974485 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:51Z","lastTransitionTime":"2025-09-30T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.076862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.076934 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.076945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.076959 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.076968 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.096012 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.096115 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.096156 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.096250 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:59:08.09622334 +0000 UTC m=+52.079311769 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.096276 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.096283 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.096330 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:08.096318043 +0000 UTC m=+52.079406472 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.096346 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 13:59:08.096338733 +0000 UTC m=+52.079427162 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.180431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.180475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.180487 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.180513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.180524 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.197300 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.197402 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.197554 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.197591 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.197606 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.197636 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:58:52 crc 
kubenswrapper[4676]: E0930 13:58:52.197668 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.197695 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.197673 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:08.197653676 +0000 UTC m=+52.180742105 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.197804 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:08.19777732 +0000 UTC m=+52.180865789 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.283421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.283467 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.283480 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.283496 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.283506 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.385909 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.385952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.385961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.385974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.385983 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.432983 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.433059 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.433123 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.433202 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.433272 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:52 crc kubenswrapper[4676]: E0930 13:58:52.433405 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.486337 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.487641 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.487680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.487692 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.487708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.487719 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.589905 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.589945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.589954 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.589967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.589976 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.660662 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" event={"ID":"70e11f41-7d9e-49ef-a2f5-0691d5f8f631","Type":"ContainerStarted","Data":"72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.677604 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugin
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2dae
d8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedA
t\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\
",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.687446 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.691713 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.691790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.691805 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc 
kubenswrapper[4676]: I0930 13:58:52.691824 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.691837 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.699811 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.712991 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.724695 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.736498 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.757020 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.770518 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.789089 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.793514 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.793545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.793555 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.793568 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.793578 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.801440 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0
84652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"na
me\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.812816 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.823296 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.833814 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.844844 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.859203 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 
13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:52Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.896209 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.896252 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.896262 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.896277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.896287 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.999116 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.999169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.999179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.999196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:52 crc kubenswrapper[4676]: I0930 13:58:52.999205 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:52Z","lastTransitionTime":"2025-09-30T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.101111 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.101154 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.101164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.101178 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.101187 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.119667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.119703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.119711 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.119727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.119738 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: E0930 13:58:53.132160 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.135620 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.135664 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.135678 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.135696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.135711 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: E0930 13:58:53.149190 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.152176 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.152212 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.152226 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.152243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.152256 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: E0930 13:58:53.167027 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status […status patch payload identical to the previous attempt, elided…] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.170869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.170943 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.170957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.170977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.170995 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: E0930 13:58:53.183840 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status […status patch payload identical to the previous attempt, elided…] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.187754 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.187794 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.187806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.187822 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.187833 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: E0930 13:58:53.199203 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:53Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:53 crc kubenswrapper[4676]: E0930 13:58:53.199318 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.203796 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.203829 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.203839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.203853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.203862 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.306089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.306394 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.306487 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.306551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.306611 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.408923 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.409009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.409032 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.409060 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.409078 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.511426 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.511785 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.511964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.512151 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.512316 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.615339 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.615394 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.615403 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.615416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.615426 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.717221 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.717271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.717281 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.717295 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.717304 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.819647 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.819685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.819697 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.819709 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.819718 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.922386 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.922437 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.922453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.922471 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:53 crc kubenswrapper[4676]: I0930 13:58:53.922481 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:53Z","lastTransitionTime":"2025-09-30T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.024618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.024655 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.024666 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.024681 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.024690 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.126914 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.126962 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.126973 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.126988 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.126997 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.229300 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.229347 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.229365 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.229382 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.229393 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.331596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.331660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.331675 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.331691 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.331720 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.432799 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.432843 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.432917 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:54 crc kubenswrapper[4676]: E0930 13:58:54.432967 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:54 crc kubenswrapper[4676]: E0930 13:58:54.433029 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:54 crc kubenswrapper[4676]: E0930 13:58:54.433100 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.434734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.434775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.434791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.434806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.434818 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.489908 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp"] Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.490277 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.492556 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.492792 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.505476 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.519256 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.533638 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 
13:58:54.537406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.537442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.537452 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.537465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.537475 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.551092 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.564715 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.580028 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.602275 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.615839 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.620668 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f8ff358-e9b0-478e-acbf-e30059107be1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.620929 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq87r\" (UniqueName: \"kubernetes.io/projected/9f8ff358-e9b0-478e-acbf-e30059107be1-kube-api-access-kq87r\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.621079 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f8ff358-e9b0-478e-acbf-e30059107be1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.621202 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f8ff358-e9b0-478e-acbf-e30059107be1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.628461 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.639451 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.639495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.639506 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 
13:58:54.639523 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.639536 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.641251 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.652645 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.664605 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.676191 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.689675 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.722506 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f8ff358-e9b0-478e-acbf-e30059107be1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.722551 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f8ff358-e9b0-478e-acbf-e30059107be1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.722613 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f8ff358-e9b0-478e-acbf-e30059107be1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.722640 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq87r\" (UniqueName: \"kubernetes.io/projected/9f8ff358-e9b0-478e-acbf-e30059107be1-kube-api-access-kq87r\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.723288 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/9f8ff358-e9b0-478e-acbf-e30059107be1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.723325 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f8ff358-e9b0-478e-acbf-e30059107be1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.739394 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f8ff358-e9b0-478e-acbf-e30059107be1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.741958 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.742051 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.742063 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.742093 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.742105 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.743270 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.747566 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq87r\" (UniqueName: \"kubernetes.io/projected/9f8ff358-e9b0-478e-acbf-e30059107be1-kube-api-access-kq87r\") pod \"ovnkube-control-plane-749d76644c-pw2gp\" (UID: \"9f8ff358-e9b0-478e-acbf-e30059107be1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.777823 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:54Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.806031 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.844408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.844450 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.844459 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.844474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.844484 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.947123 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.947166 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.947178 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.947194 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:54 crc kubenswrapper[4676]: I0930 13:58:54.947204 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:54Z","lastTransitionTime":"2025-09-30T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.049609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.049648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.049660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.049676 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.049689 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.152370 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.152414 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.152425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.152442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.152454 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.254303 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.254345 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.254353 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.254368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.254378 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.356897 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.356945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.356955 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.356971 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.356981 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.459489 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.459530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.459540 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.459556 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.459566 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.562152 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.562197 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.562209 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.562225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.562239 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.664819 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.664858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.664866 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.664898 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.664910 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.669254 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" event={"ID":"9f8ff358-e9b0-478e-acbf-e30059107be1","Type":"ContainerStarted","Data":"31f13ad12acc5942a8185367146d84394a5c3e8ab07cbbc773ce320b3b6c504a"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.766493 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.766523 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.766531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.766544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.766562 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.868853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.868912 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.868921 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.868935 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.868944 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.937437 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-sksn7"] Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.937844 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:55 crc kubenswrapper[4676]: E0930 13:58:55.937928 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.952112 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 
13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.963415 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.970957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.971003 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.971014 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.971028 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.971037 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:55Z","lastTransitionTime":"2025-09-30T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:55 crc kubenswrapper[4676]: I0930 13:58:55.973168 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:55 crc 
kubenswrapper[4676]: I0930 13:58:55.986285 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb76
6b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.001071 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:55Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.016394 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.031529 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.045962 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.062118 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.074283 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.074422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.074511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.074630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.074723 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.084619 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.097092 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.109847 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.130438 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.134857 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.134950 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbn7\" (UniqueName: \"kubernetes.io/projected/e47ea2c6-e937-4411-b4c9-98048a5e5f05-kube-api-access-gtbn7\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.144425 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.157814 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.171406 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.176982 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.177012 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.177020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.177033 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.177045 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.183224 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.235631 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:56 crc kubenswrapper[4676]: 
I0930 13:58:56.235708 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbn7\" (UniqueName: \"kubernetes.io/projected/e47ea2c6-e937-4411-b4c9-98048a5e5f05-kube-api-access-gtbn7\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:56 crc kubenswrapper[4676]: E0930 13:58:56.235786 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:56 crc kubenswrapper[4676]: E0930 13:58:56.235849 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:56.735834303 +0000 UTC m=+40.718922732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.254368 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtbn7\" (UniqueName: \"kubernetes.io/projected/e47ea2c6-e937-4411-b4c9-98048a5e5f05-kube-api-access-gtbn7\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.278793 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.278832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 
30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.278844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.278858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.278868 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.381629 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.381679 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.381689 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.381705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.381716 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.432362 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.432419 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.432461 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:56 crc kubenswrapper[4676]: E0930 13:58:56.432505 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:56 crc kubenswrapper[4676]: E0930 13:58:56.432600 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:56 crc kubenswrapper[4676]: E0930 13:58:56.432856 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.484242 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.484283 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.484294 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.484311 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.484322 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.586484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.586785 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.586867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.586995 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.587113 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.674579 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" event={"ID":"9f8ff358-e9b0-478e-acbf-e30059107be1","Type":"ContainerStarted","Data":"4f5e3bfb55994ad5563473790f09c8a934f98e96fe7c88bd1f0dc1ae3dfc7a0f"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.674626 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" event={"ID":"9f8ff358-e9b0-478e-acbf-e30059107be1","Type":"ContainerStarted","Data":"5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.676594 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/0.log" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.679641 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32" exitCode=1 Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.679682 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.680290 4676 scope.go:117] "RemoveContainer" containerID="73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.689304 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.689365 4676 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.689380 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.689404 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.689421 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.691376 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.701715 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.720218 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.733138 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.740151 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:56 crc kubenswrapper[4676]: E0930 13:58:56.740658 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:56 crc kubenswrapper[4676]: E0930 13:58:56.740751 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:57.740727037 +0000 UTC m=+41.723815466 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.744690 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.757821 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.769483 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.782440 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.791399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.791433 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.791445 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.791463 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.791475 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.793133 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.803701 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.825562 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-re
adyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mount
Path\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.839249 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.849948 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.864623 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.875247 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.886597 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.893686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.893726 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.893736 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc 
kubenswrapper[4676]: I0930 13:58:56.893750 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.893759 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.900595 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c
2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.914544 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1
e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.924668 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.935396 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc 
kubenswrapper[4676]: I0930 13:58:56.951097 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb76
6b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.963394 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.976573 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.991927 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:56Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.995707 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.995748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.995758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 
13:58:56.995773 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:56 crc kubenswrapper[4676]: I0930 13:58:56.995783 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:56Z","lastTransitionTime":"2025-09-30T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.003198 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.013541 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.032268 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768376 5975 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768532 5975 reflector.go:311] Stopping 
reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.769997 5975 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.772368 5975 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:58:54.772453 5975 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:54.772469 5975 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 13:58:54.772480 5975 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:54.772486 5975 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:58:54.772499 5975 factory.go:656] Stopping watch factory\\\\nI0930 13:58:54.772512 5975 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:54.772567 5975 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:54.772552 5975 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:54.772582 5975 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:58:54.772588 5975 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db
492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.047667 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.057850 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.072614 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.090904 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.098558 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.098591 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.098599 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.098613 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.098811 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.103644 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.114975 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.127527 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.200622 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.200658 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.200675 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.200690 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.200700 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.302333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.302372 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.302381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.302398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.302408 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.404832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.404866 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.404889 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.404903 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.404912 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.432430 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:57 crc kubenswrapper[4676]: E0930 13:58:57.432551 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.445696 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.457806 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.470751 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.482838 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.492544 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.503785 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.507172 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.507207 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.507219 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.507235 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.507246 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.522052 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768376 5975 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768532 5975 reflector.go:311] Stopping 
reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.769997 5975 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.772368 5975 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:58:54.772453 5975 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:54.772469 5975 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 13:58:54.772480 5975 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:54.772486 5975 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:58:54.772499 5975 factory.go:656] Stopping watch factory\\\\nI0930 13:58:54.772512 5975 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:54.772567 5975 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:54.772552 5975 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:54.772582 5975 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:58:54.772588 5975 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db
492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.533794 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.553099 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.574622 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.586631 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.598319 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.609313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.609352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.609361 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.609374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.609383 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.613138 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.626412 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.642914 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1
e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.656384 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.666797 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc 
kubenswrapper[4676]: I0930 13:58:57.687779 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/1.log" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.688347 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/0.log" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.690815 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c" exitCode=1 Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.690874 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.690960 4676 scope.go:117] "RemoveContainer" containerID="73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.691648 4676 scope.go:117] "RemoveContainer" containerID="bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c" Sep 30 13:58:57 crc kubenswrapper[4676]: E0930 13:58:57.691823 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.707194 4676 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.711282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.711316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.711325 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.711342 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.711353 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.718914 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.737010 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768376 5975 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768532 5975 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.769997 5975 reflector.go:311] 
Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.772368 5975 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:58:54.772453 5975 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:54.772469 5975 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 13:58:54.772480 5975 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:54.772486 5975 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:58:54.772499 5975 factory.go:656] Stopping watch factory\\\\nI0930 13:58:54.772512 5975 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:54.772567 5975 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:54.772552 5975 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:54.772582 5975 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:58:54.772588 5975 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:57Z\\\",\\\"message\\\":\\\" 6225 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.637573 6225 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:57.637613 6225 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:57.637633 6225 factory.go:656] Stopping watch factory\\\\nI0930 13:58:57.637649 6225 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:57.637657 6225 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
13:58:57.637908 6225 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638020 6225 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638147 6225 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.651829 6225 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 13:58:57.651866 6225 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 13:58:57.651948 6225 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:57.651969 6225 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:58:57.652043 6225 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.747917 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:57 crc kubenswrapper[4676]: E0930 13:58:57.748269 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:57 crc kubenswrapper[4676]: E0930 13:58:57.748379 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 13:58:59.748354535 +0000 UTC m=+43.731442964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.750152 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.761696 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.770997 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.781164 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.793727 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.806521 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.813104 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 
13:58:57.813153 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.813163 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.813180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.813189 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.816006 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.827854 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.846305 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.857323 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.869746 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.881185 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.890376 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.898196 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:58:57Z is after 2025-08-24T17:21:41Z" Sep 30 13:58:57 crc 
kubenswrapper[4676]: I0930 13:58:57.915710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.915732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.915743 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.915755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:57 crc kubenswrapper[4676]: I0930 13:58:57.915763 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:57Z","lastTransitionTime":"2025-09-30T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.017530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.017577 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.017588 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.017606 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.017619 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.119386 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.119425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.119434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.119450 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.119461 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.223831 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.224206 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.224219 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.224239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.224255 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.326510 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.326563 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.326575 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.326594 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.326607 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.429026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.429077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.429094 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.429111 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.429122 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.432514 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.432523 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:58:58 crc kubenswrapper[4676]: E0930 13:58:58.432608 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.432639 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:58:58 crc kubenswrapper[4676]: E0930 13:58:58.432766 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:58:58 crc kubenswrapper[4676]: E0930 13:58:58.432918 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.531650 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.531683 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.531734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.531757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.531770 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.634136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.634174 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.634184 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.634197 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.634206 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.695542 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/1.log" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.736253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.736295 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.736306 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.736324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.736335 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.839247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.839286 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.839296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.839310 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.839321 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.941831 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.941891 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.941901 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.941914 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:58 crc kubenswrapper[4676]: I0930 13:58:58.941923 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:58Z","lastTransitionTime":"2025-09-30T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.044266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.044301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.044311 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.044328 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.044336 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.147258 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.147308 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.147321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.147339 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.147351 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.249478 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.249524 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.249535 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.249550 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.249560 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.353077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.353144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.353161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.353186 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.353204 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.432670 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:59 crc kubenswrapper[4676]: E0930 13:58:59.432824 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.456422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.456513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.456536 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.456573 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.456600 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.559836 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.559933 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.559953 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.559983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.559999 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.663104 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.663163 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.663173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.663189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.663199 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.765715 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.765755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.765764 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.765777 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.765786 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.765809 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:58:59 crc kubenswrapper[4676]: E0930 13:58:59.766096 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:59 crc kubenswrapper[4676]: E0930 13:58:59.766224 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:03.766194683 +0000 UTC m=+47.749283302 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.869128 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.869204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.869219 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.869243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.869264 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.972648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.972707 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.972718 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.972741 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:58:59 crc kubenswrapper[4676]: I0930 13:58:59.972752 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:58:59Z","lastTransitionTime":"2025-09-30T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.075467 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.075530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.075548 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.075574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.075593 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.178133 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.178180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.178197 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.178221 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.178234 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.281009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.281062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.281073 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.281089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.281100 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.383425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.383491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.383505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.383529 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.383544 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.431933 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:00 crc kubenswrapper[4676]: E0930 13:59:00.432105 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.432202 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.432249 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:00 crc kubenswrapper[4676]: E0930 13:59:00.432330 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:00 crc kubenswrapper[4676]: E0930 13:59:00.432495 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.486248 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.486323 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.486336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.486362 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.486379 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.588473 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.588540 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.588555 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.588578 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.588594 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.691999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.692050 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.692059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.692075 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.692086 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.794458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.794511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.794520 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.794535 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.794543 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.897938 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.897985 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.897995 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.898012 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:00 crc kubenswrapper[4676]: I0930 13:59:00.898023 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:00Z","lastTransitionTime":"2025-09-30T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.001844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.001930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.001945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.001968 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.001983 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.104384 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.104471 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.104486 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.104505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.104517 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.207215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.207254 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.207266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.207283 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.207295 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.309953 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.310022 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.310045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.310079 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.310103 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.412686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.412745 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.412756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.412775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.412786 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.432291 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:01 crc kubenswrapper[4676]: E0930 13:59:01.432479 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.515905 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.515944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.515954 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.515970 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.515981 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.619781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.619864 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.619917 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.619937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.619948 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.722832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.722981 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.723019 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.723059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.723104 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.826834 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.826941 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.826961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.826985 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.827002 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.930227 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.930276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.930287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.930305 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:01 crc kubenswrapper[4676]: I0930 13:59:01.930314 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:01Z","lastTransitionTime":"2025-09-30T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.033493 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.033543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.033552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.033575 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.033585 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.136347 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.136384 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.136393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.136407 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.136417 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.238710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.238755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.238765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.238781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.238790 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.341296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.341362 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.341373 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.341395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.341407 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.432977 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.432977 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.433012 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:02 crc kubenswrapper[4676]: E0930 13:59:02.433192 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:02 crc kubenswrapper[4676]: E0930 13:59:02.433257 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:02 crc kubenswrapper[4676]: E0930 13:59:02.433476 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.444115 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.444173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.444195 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.444215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.444228 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.547485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.547553 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.547566 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.547586 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.547597 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.649685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.649752 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.649766 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.649791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.649805 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.752091 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.752128 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.752139 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.752159 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.752172 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.855259 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.855290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.855300 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.855317 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.855329 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.957939 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.957982 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.957991 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.958007 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:02 crc kubenswrapper[4676]: I0930 13:59:02.958016 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:02Z","lastTransitionTime":"2025-09-30T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.060230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.060277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.060286 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.060303 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.060316 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.162537 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.162582 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.162591 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.162606 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.162615 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.265034 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.265073 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.265084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.265103 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.265115 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.320895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.320936 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.320953 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.320970 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.320979 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.333815 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.337608 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.337651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.337662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.337677 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.337686 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.348351 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.351461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.351522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.351550 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.351576 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.351586 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.362802 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.366643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.366673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.366685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.366702 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.366715 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.381618 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.385423 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.385466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.385479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.385499 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.385512 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.396722 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:03Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.396850 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.398423 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.398453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.398461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.398484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.398494 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.432514 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.432660 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.500636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.500694 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.500709 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.500724 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.500735 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.603261 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.603315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.603348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.603363 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.603371 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.705416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.705462 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.705478 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.705492 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.705750 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.808127 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.808168 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.808179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.808195 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.808206 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.811789 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.811926 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:59:03 crc kubenswrapper[4676]: E0930 13:59:03.811982 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:11.811968094 +0000 UTC m=+55.795056523 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.910895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.910938 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.910950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.910968 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:03 crc kubenswrapper[4676]: I0930 13:59:03.910979 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:03Z","lastTransitionTime":"2025-09-30T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.014005 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.014059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.014072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.014095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.014112 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.116700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.116752 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.116775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.116823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.116869 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.219727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.219768 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.219783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.219809 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.219823 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.321928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.321970 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.321979 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.321994 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.322003 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.424690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.424732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.424742 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.424760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.424771 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.431898 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.431931 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.431916 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:04 crc kubenswrapper[4676]: E0930 13:59:04.432022 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:04 crc kubenswrapper[4676]: E0930 13:59:04.432091 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:04 crc kubenswrapper[4676]: E0930 13:59:04.432175 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.527377 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.527419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.527434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.527452 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.527462 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.629592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.629636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.629647 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.629666 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.629677 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.731502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.731535 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.731543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.731556 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.731564 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.833949 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.833980 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.833988 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.834002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.834010 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.936324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.936365 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.936375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.936391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:04 crc kubenswrapper[4676]: I0930 13:59:04.936401 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:04Z","lastTransitionTime":"2025-09-30T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.038470 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.038552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.038568 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.038587 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.038601 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.140783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.140821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.140858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.140889 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.140919 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.244458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.244499 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.244513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.244530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.244541 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.347963 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.348010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.348022 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.348038 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.348050 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.432316 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:05 crc kubenswrapper[4676]: E0930 13:59:05.432540 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.451033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.451099 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.451117 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.451141 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.451151 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.555001 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.555077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.555095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.555124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.555142 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.658844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.659010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.659040 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.659097 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.659129 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.762441 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.762505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.762522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.762547 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.762563 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.865455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.865515 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.865525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.865547 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.865560 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.968669 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.968728 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.968737 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.968757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:05 crc kubenswrapper[4676]: I0930 13:59:05.968768 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:05Z","lastTransitionTime":"2025-09-30T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.072137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.072215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.072235 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.072264 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.072283 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.175353 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.175391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.175404 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.175422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.175434 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.278019 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.278070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.278082 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.278105 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.278116 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.380895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.381297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.381432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.381533 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.381621 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.432684 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.432835 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.433341 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:06 crc kubenswrapper[4676]: E0930 13:59:06.433477 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:06 crc kubenswrapper[4676]: E0930 13:59:06.433686 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:06 crc kubenswrapper[4676]: E0930 13:59:06.433938 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.484997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.485065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.485077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.485097 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.485117 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.587525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.587565 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.587573 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.587586 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.587596 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.690289 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.690343 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.690359 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.690381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.690397 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.793757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.793803 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.793815 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.793838 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.793853 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.896821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.896928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.896949 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.896977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:06 crc kubenswrapper[4676]: I0930 13:59:06.896998 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:06Z","lastTransitionTime":"2025-09-30T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.000261 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.000336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.000359 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.000387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.000407 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.104617 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.104666 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.104676 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.104692 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.104701 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.207252 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.207288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.207296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.207328 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.207338 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.311921 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.311997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.312011 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.312036 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.312051 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.415019 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.415110 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.415135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.415166 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.415184 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.432355 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:07 crc kubenswrapper[4676]: E0930 13:59:07.432575 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.450034 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.466017 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.480162 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.492495 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.515967 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cabf42ce5073083d3f5db7b98a621247f45be0b8c0d6dcd43e3bc2810c6f32\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768376 5975 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.768532 5975 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.769997 5975 reflector.go:311] 
Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:54.772368 5975 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 13:58:54.772453 5975 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:54.772469 5975 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 13:58:54.772480 5975 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:54.772486 5975 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 13:58:54.772499 5975 factory.go:656] Stopping watch factory\\\\nI0930 13:58:54.772512 5975 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:54.772567 5975 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:54.772552 5975 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:54.772582 5975 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 13:58:54.772588 5975 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:57Z\\\",\\\"message\\\":\\\" 6225 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.637573 6225 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:57.637613 6225 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:57.637633 6225 factory.go:656] Stopping watch factory\\\\nI0930 13:58:57.637649 6225 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:57.637657 6225 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
13:58:57.637908 6225 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638020 6225 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638147 6225 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.651829 6225 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 13:58:57.651866 6225 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 13:58:57.651948 6225 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:57.651969 6225 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:58:57.652043 6225 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.516916 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.516947 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.516958 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.516973 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.516983 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.529953 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.544655 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.561512 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13
:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-c
opy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.572986 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.584986 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.597328 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.611523 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.619232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.619267 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.619276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.619291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.619303 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.625742 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.638471 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.650565 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc 
kubenswrapper[4676]: I0930 13:59:07.665162 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb76
6b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.679131 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:07Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.721695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.721757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.721774 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.721800 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.721815 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.824654 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.824713 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.824729 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.824746 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.824797 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.928189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.928249 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.928268 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.928292 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:07 crc kubenswrapper[4676]: I0930 13:59:07.928305 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:07Z","lastTransitionTime":"2025-09-30T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.032019 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.032070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.032083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.032103 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.032116 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.137222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.137298 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.137318 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.137351 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.137372 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.154079 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.154205 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.154259 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 13:59:40.154231053 +0000 UTC m=+84.137319492 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.154342 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.154389 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.154491 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:40.154464018 +0000 UTC m=+84.137552487 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.154499 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.154542 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:40.15453274 +0000 UTC m=+84.137621179 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.245163 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.245269 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.245313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.245357 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 
30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.245390 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.255618 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.255692 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.255854 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.255914 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.255931 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.255988 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:40.255967895 +0000 UTC m=+84.239056334 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.256149 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.256240 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.256267 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.256376 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:40.256345535 +0000 UTC m=+84.239434004 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.348744 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.348804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.348816 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.348835 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.348851 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.432106 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.432190 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.432251 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.432347 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.432524 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:08 crc kubenswrapper[4676]: E0930 13:59:08.432663 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.452064 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.452119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.452135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.452157 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.452169 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.555310 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.555372 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.555385 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.555410 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.555427 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.657591 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.657623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.657631 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.657644 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.657654 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.759920 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.759963 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.759977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.759993 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.760009 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.861754 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.861792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.861808 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.861825 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.861838 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.964065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.964124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.964133 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.964145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:08 crc kubenswrapper[4676]: I0930 13:59:08.964154 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:08Z","lastTransitionTime":"2025-09-30T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.065977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.066010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.066018 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.066031 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.066040 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.168899 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.168928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.168937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.168950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.168960 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.271577 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.271624 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.271633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.271649 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.271660 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.374596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.374693 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.374708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.374730 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.374765 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.433086 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:09 crc kubenswrapper[4676]: E0930 13:59:09.433317 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.434606 4676 scope.go:117] "RemoveContainer" containerID="bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.450512 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.465755 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.478193 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.478237 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.478247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.478264 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.478275 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.480406 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.503518 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:57Z\\\",\\\"message\\\":\\\" 6225 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.637573 6225 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:57.637613 6225 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:57.637633 6225 factory.go:656] 
Stopping watch factory\\\\nI0930 13:58:57.637649 6225 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:57.637657 6225 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:57.637908 6225 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638020 6225 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638147 6225 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.651829 6225 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 13:58:57.651866 6225 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 13:58:57.651948 6225 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:57.651969 6225 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:58:57.652043 6225 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6
f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.516688 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.531231 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.549304 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.565689 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.581266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.581291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.581301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.581313 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.581325 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.585174 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.603206 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.617910 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.640972 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.655706 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.671010 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc 
kubenswrapper[4676]: I0930 13:59:09.684257 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.684314 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.684324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.684346 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.684356 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.686774 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.703097 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090
d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.714109 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: 
I0930 13:59:09.735350 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/1.log" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.738717 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.739157 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.755564 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.770359 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.780163 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.786795 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.786858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.786869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.786904 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.786915 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.791473 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.811558 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:57Z\\\",\\\"message\\\":\\\" 6225 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.637573 6225 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:57.637613 6225 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:57.637633 6225 factory.go:656] 
Stopping watch factory\\\\nI0930 13:58:57.637649 6225 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:57.637657 6225 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:57.637908 6225 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638020 6225 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638147 6225 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.651829 6225 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 13:58:57.651866 6225 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 13:58:57.651948 6225 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:57.651969 6225 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:58:57.652043 6225 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.844112 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.864240 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.877826 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.890045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.890090 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.890100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.890116 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.890131 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.898341 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.913548 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.930471 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.943455 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.959008 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1
e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.972424 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.986625 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:09Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:09 crc 
kubenswrapper[4676]: I0930 13:59:09.992121 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.992156 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.992165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.992197 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:09 crc kubenswrapper[4676]: I0930 13:59:09.992206 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:09Z","lastTransitionTime":"2025-09-30T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.006315 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.019909 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.095190 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.095522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.095677 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.095836 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.095946 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.198423 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.198455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.198465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.198480 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.198493 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.301072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.301100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.301112 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.301129 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.301141 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.403860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.403921 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.403930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.403945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.403956 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.432918 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.432979 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:10 crc kubenswrapper[4676]: E0930 13:59:10.433036 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:10 crc kubenswrapper[4676]: E0930 13:59:10.433096 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.433153 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:10 crc kubenswrapper[4676]: E0930 13:59:10.433621 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.499849 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.506194 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.506250 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.506263 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.506285 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.506300 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.511413 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.520961 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6
163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\
":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.535599 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.551058 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.565112 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.575657 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.587302 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.605511 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:57Z\\\",\\\"message\\\":\\\" 6225 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.637573 6225 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:57.637613 6225 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:57.637633 6225 factory.go:656] 
Stopping watch factory\\\\nI0930 13:58:57.637649 6225 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:57.637657 6225 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:57.637908 6225 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638020 6225 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638147 6225 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.651829 6225 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 13:58:57.651866 6225 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 13:58:57.651948 6225 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:57.651969 6225 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:58:57.652043 6225 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.609650 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.609713 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.609732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.609758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.609780 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.622734 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:
58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.663946 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.679007 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.694248 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.708528 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.712380 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.712416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.712428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.712445 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.712459 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.726332 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.744346 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.745046 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/2.log" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.745821 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/1.log" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.750217 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90" exitCode=1 Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.751353 4676 scope.go:117] "RemoveContainer" containerID="63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90" Sep 30 13:59:10 crc kubenswrapper[4676]: E0930 13:59:10.751480 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.751527 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.751567 4676 scope.go:117] "RemoveContainer" containerID="bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 
13:59:10.761899 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca
8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"set
up\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.777147 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.787228 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc 
kubenswrapper[4676]: I0930 13:59:10.799709 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9
b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 
13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.809616 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.815892 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.815947 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.815963 4676 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.815989 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.816002 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.818956 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 
30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.833733 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64
c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\
"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.845440 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d6643
8c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.857923 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.874604 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.890301 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.905386 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.919683 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.919747 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.919758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.919781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.919828 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:10Z","lastTransitionTime":"2025-09-30T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.932753 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb3451ef827f096afd86039c4c7569ef452c57ddf12d48687f99553e0a851d4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:58:57Z\\\",\\\"message\\\":\\\" 6225 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.637573 6225 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 13:58:57.637613 6225 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 13:58:57.637633 6225 factory.go:656] 
Stopping watch factory\\\\nI0930 13:58:57.637649 6225 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 13:58:57.637657 6225 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 13:58:57.637908 6225 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638020 6225 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.638147 6225 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 13:58:57.651829 6225 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 13:58:57.651866 6225 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 13:58:57.651948 6225 ovnkube.go:599] Stopped ovnkube\\\\nI0930 13:58:57.651969 6225 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 13:58:57.652043 6225 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\
"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.949075 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.971872 4676 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec48996
9feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681e
daf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:10 crc kubenswrapper[4676]: I0930 13:59:10.990372 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.005490 4676 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.020175 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.022051 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.022120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.022136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.022159 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.022178 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.034121 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.049109 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.063331 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.125075 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.125114 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.125123 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.125138 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.125151 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.227491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.227541 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.227553 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.227573 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.227589 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.330761 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.330823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.330839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.330861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.330900 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.432404 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:11 crc kubenswrapper[4676]: E0930 13:59:11.432734 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.433806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.433849 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.433861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.433898 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.433910 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.535626 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.535670 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.535679 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.535690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.535699 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.639171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.639220 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.639231 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.639245 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.639257 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.741337 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.741399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.741419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.741438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.741455 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.753925 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/2.log" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.756926 4676 scope.go:117] "RemoveContainer" containerID="63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90" Sep 30 13:59:11 crc kubenswrapper[4676]: E0930 13:59:11.757084 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.769119 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.786633 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.797716 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.808080 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.818739 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.828964 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.839917 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.843626 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.843673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.843688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.843721 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.843742 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.850192 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.862530 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.875066 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.886438 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc 
kubenswrapper[4676]: I0930 13:59:11.894945 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:11 crc kubenswrapper[4676]: E0930 13:59:11.895086 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:59:11 crc kubenswrapper[4676]: E0930 13:59:11.895146 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:27.895129581 +0000 UTC m=+71.878218020 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.905390 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090
d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.916949 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: 
I0930 13:59:11.929306 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.940556 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.946242 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.946277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.946287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 
13:59:11.946301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.946310 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:11Z","lastTransitionTime":"2025-09-30T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.950018 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.959289 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:11 crc kubenswrapper[4676]: I0930 13:59:11.974700 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook 
\\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6
f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:11Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.049795 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.049834 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.049842 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.049860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.049873 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.152221 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.152506 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.152600 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.152685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.152761 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.254861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.254919 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.254931 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.254947 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.254959 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.357674 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.357708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.357718 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.357737 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.357746 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.432242 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.432537 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.432425 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:12 crc kubenswrapper[4676]: E0930 13:59:12.432801 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:12 crc kubenswrapper[4676]: E0930 13:59:12.432964 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:12 crc kubenswrapper[4676]: E0930 13:59:12.433060 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.460837 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.461190 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.461312 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.461432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.461537 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.564098 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.564135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.564144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.564163 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.564176 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.667927 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.667974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.667987 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.668008 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.668022 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.770080 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.770139 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.770150 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.770167 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.770181 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.872667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.872703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.872716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.872734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.872747 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.975116 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.975156 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.975166 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.975180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:12 crc kubenswrapper[4676]: I0930 13:59:12.975189 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:12Z","lastTransitionTime":"2025-09-30T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.077827 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.077855 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.077862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.077893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.077903 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.180039 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.180078 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.180086 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.180102 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.180111 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.282029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.282072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.282084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.282101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.282114 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.384937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.384974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.384986 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.385005 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.385018 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.432917 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:13 crc kubenswrapper[4676]: E0930 13:59:13.433688 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.487774 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.488234 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.488368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.488470 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.488560 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.527570 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.528006 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.528063 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.528088 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.528099 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: E0930 13:59:13.540952 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:13Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.546602 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.546651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.546671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.546730 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.546742 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: E0930 13:59:13.562991 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:13Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.567992 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.568036 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.568049 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.568076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.568089 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: E0930 13:59:13.583508 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:13Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.589204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.589276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.589290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.589314 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.589337 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: E0930 13:59:13.608499 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:13Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.613252 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.613330 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.613346 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.613368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.613534 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: E0930 13:59:13.627114 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:13Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:13 crc kubenswrapper[4676]: E0930 13:59:13.627493 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.629580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.629619 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.629627 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.629642 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.629652 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.732540 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.732584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.732595 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.732611 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.732619 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.835066 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.835120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.835135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.835155 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.835165 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.938468 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.938512 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.938521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.938535 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:13 crc kubenswrapper[4676]: I0930 13:59:13.938545 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:13Z","lastTransitionTime":"2025-09-30T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.041434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.041483 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.041495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.041509 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.041520 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.146225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.146285 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.146300 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.146319 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.146627 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.249319 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.249356 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.249368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.249385 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.249399 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.352236 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.352276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.352290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.352308 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.352320 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.432871 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.432994 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:14 crc kubenswrapper[4676]: E0930 13:59:14.433057 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:14 crc kubenswrapper[4676]: E0930 13:59:14.433109 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.432899 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:14 crc kubenswrapper[4676]: E0930 13:59:14.433185 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.454904 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.454949 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.454962 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.454977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.454991 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.556931 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.557237 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.557336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.557421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.557515 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.660202 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.660255 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.660272 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.660295 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.660311 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.762209 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.762242 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.762253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.762268 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.762278 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.864818 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.864855 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.864865 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.864896 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.864907 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.967632 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.968107 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.968218 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.968308 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:14 crc kubenswrapper[4676]: I0930 13:59:14.968411 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:14Z","lastTransitionTime":"2025-09-30T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.071401 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.071458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.071474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.071492 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.071503 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.174957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.175038 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.175055 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.175080 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.175092 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.277743 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.277783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.277793 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.277808 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.277817 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.380651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.380704 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.380715 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.380738 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.380750 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.432800 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:15 crc kubenswrapper[4676]: E0930 13:59:15.433006 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.484268 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.484313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.484325 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.484343 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.484354 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.586944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.587013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.587028 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.587071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.587090 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.689230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.689273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.689287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.689322 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.689335 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.791455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.791496 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.791507 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.791522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.791532 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.894295 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.894329 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.894340 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.894355 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.894365 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.996935 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.996970 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.996980 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.996993 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:15 crc kubenswrapper[4676]: I0930 13:59:15.997002 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:15Z","lastTransitionTime":"2025-09-30T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.100328 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.100372 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.100383 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.100398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.100408 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.204321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.204537 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.204631 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.204711 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.204773 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.306775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.306812 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.306821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.306835 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.306844 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.408699 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.408744 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.408755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.408772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.408784 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.432377 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.432469 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.432385 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:16 crc kubenswrapper[4676]: E0930 13:59:16.432522 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:16 crc kubenswrapper[4676]: E0930 13:59:16.432590 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:16 crc kubenswrapper[4676]: E0930 13:59:16.432655 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.511334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.511390 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.511399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.511411 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.511420 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.614136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.614180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.614189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.614207 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.614217 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.716805 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.716849 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.716860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.716903 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.716917 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.819749 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.819817 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.819828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.819849 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.819859 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.923249 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.923324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.923336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.923354 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:16 crc kubenswrapper[4676]: I0930 13:59:16.923365 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:16Z","lastTransitionTime":"2025-09-30T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.026551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.026598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.026609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.026628 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.026640 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.129863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.129926 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.129935 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.129951 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.129960 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.232648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.232695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.232713 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.232734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.232750 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.336355 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.336416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.336430 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.336456 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.336475 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.432441 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:17 crc kubenswrapper[4676]: E0930 13:59:17.432749 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.442424 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.442476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.442486 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.442501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.442584 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.445454 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.465771 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook 
\\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6
f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.476981 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.488251 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.497247 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.508296 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.519191 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.531411 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.545455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 
13:59:17.545495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.545505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.545546 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.545558 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.546106 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.566608 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.585913 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.597140 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.607933 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.621705 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1
e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.633317 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.642962 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc 
kubenswrapper[4676]: I0930 13:59:17.647470 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.647501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.647512 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.647528 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.647538 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.656039 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.666568 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:17Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.749502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.749555 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.749566 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.749580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.749590 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.851755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.851790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.851798 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.851810 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.851819 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.954720 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.955300 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.955392 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.955475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:17 crc kubenswrapper[4676]: I0930 13:59:17.955564 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:17Z","lastTransitionTime":"2025-09-30T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.060141 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.060210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.060232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.060260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.060277 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.163734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.163781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.163792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.163810 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.163821 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.272322 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.272653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.272804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.272974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.273619 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.375936 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.375974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.375983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.375997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.376006 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.432867 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.433008 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.432937 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:18 crc kubenswrapper[4676]: E0930 13:59:18.433133 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:18 crc kubenswrapper[4676]: E0930 13:59:18.433272 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:18 crc kubenswrapper[4676]: E0930 13:59:18.433494 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.479093 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.479530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.479598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.479669 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.479732 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.581813 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.581858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.581867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.581899 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.581909 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.684360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.684408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.684418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.684435 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.684447 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.786342 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.786377 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.786386 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.786399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.786449 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.888961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.888990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.888999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.889013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.889022 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.992443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.992862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.993062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.993174 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:18 crc kubenswrapper[4676]: I0930 13:59:18.993301 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:18Z","lastTransitionTime":"2025-09-30T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.096484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.096559 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.096574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.096598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.096611 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.199207 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.199266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.199281 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.199305 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.199321 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.302485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.302534 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.302547 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.302571 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.302584 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.405113 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.405169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.405182 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.405204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.405218 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.432936 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:19 crc kubenswrapper[4676]: E0930 13:59:19.433187 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.507593 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.507670 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.507690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.507722 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.507741 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.610554 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.610671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.610687 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.610704 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.610717 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.714164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.714223 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.714233 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.714253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.714264 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.817706 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.817748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.817758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.817778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.817790 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.920742 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.920776 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.920783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.920797 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:19 crc kubenswrapper[4676]: I0930 13:59:19.920805 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:19Z","lastTransitionTime":"2025-09-30T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.023618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.023655 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.023668 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.023684 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.023695 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.126668 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.126709 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.126721 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.126739 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.126752 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.229336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.229383 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.229395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.229415 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.229429 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.332817 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.332859 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.332869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.332906 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.332919 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.432645 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.432759 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.432866 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:20 crc kubenswrapper[4676]: E0930 13:59:20.432789 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:20 crc kubenswrapper[4676]: E0930 13:59:20.433064 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:20 crc kubenswrapper[4676]: E0930 13:59:20.433095 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.435320 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.435350 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.435361 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.435376 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.435386 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.537474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.537517 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.537530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.537544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.537554 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.640794 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.640866 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.641115 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.641144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.641156 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.743931 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.744010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.744024 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.744049 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.744065 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.846416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.846451 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.846461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.846476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.846485 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.949140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.949371 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.949435 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.949500 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:20 crc kubenswrapper[4676]: I0930 13:59:20.949557 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:20Z","lastTransitionTime":"2025-09-30T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.052268 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.052310 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.052319 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.052336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.052346 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.154957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.155013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.155025 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.155044 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.155057 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.259056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.259126 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.259145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.259170 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.259186 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.361627 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.361670 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.361681 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.361700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.361713 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.435157 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:21 crc kubenswrapper[4676]: E0930 13:59:21.435904 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.463922 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.463965 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.463980 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.464003 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.464018 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.566716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.567142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.567291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.567389 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.567464 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.669890 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.669924 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.669935 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.669952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.669961 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.776120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.776399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.776533 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.776654 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.776727 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.878688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.878732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.878772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.878788 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.878800 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.981140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.981204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.981213 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.981229 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:21 crc kubenswrapper[4676]: I0930 13:59:21.981242 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:21Z","lastTransitionTime":"2025-09-30T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.084095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.084161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.084173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.084188 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.084200 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.186787 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.186869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.186898 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.186915 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.186927 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.289821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.289918 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.289933 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.289951 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.289963 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.392387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.392425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.392433 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.392448 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.392459 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.432835 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.432925 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:22 crc kubenswrapper[4676]: E0930 13:59:22.433016 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.433112 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:22 crc kubenswrapper[4676]: E0930 13:59:22.433180 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:22 crc kubenswrapper[4676]: E0930 13:59:22.433252 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.495570 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.495979 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.495997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.496020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.496033 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.598432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.598683 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.598750 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.598819 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.598897 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.702083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.702124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.702135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.702153 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.702169 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.804806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.804848 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.804860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.804893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.804908 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.907313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.907354 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.907365 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.907378 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:22 crc kubenswrapper[4676]: I0930 13:59:22.907388 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:22Z","lastTransitionTime":"2025-09-30T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.010271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.010327 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.010338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.010358 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.010373 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.112637 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.112679 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.112695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.112712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.112722 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.216492 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.216796 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.216904 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.217003 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.217073 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.319988 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.320068 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.320079 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.320100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.320112 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.423754 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.424259 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.424280 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.424305 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.424333 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.432252 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:23 crc kubenswrapper[4676]: E0930 13:59:23.432511 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.531136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.531190 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.531200 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.531216 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.531227 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.634326 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.634404 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.634419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.634444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.634458 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.736767 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.736824 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.736839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.736864 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.736897 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.839687 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.839748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.839760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.839780 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.839794 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.944016 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.944069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.944087 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.944458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:23 crc kubenswrapper[4676]: I0930 13:59:23.944474 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:23Z","lastTransitionTime":"2025-09-30T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.017678 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.017741 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.017754 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.017780 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.017791 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.033844 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.039026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.039076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.039088 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.039106 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.039117 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.052747 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.057664 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.057735 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.057749 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.057774 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.057825 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.071251 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.074997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.075056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.075074 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.075095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.075109 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.088512 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.092276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.092331 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.092351 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.092373 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.092387 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.106444 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:24Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.106639 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.108466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.108505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.108515 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.108530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.108543 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.210552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.210636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.210654 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.210690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.210707 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.314393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.314444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.314454 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.314474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.314485 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.417694 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.417751 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.417766 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.417788 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.417802 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.432338 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.432403 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.432438 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.432486 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.432554 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:24 crc kubenswrapper[4676]: E0930 13:59:24.432630 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.521806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.521869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.521910 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.521933 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.521947 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.625380 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.625463 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.625474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.625498 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.625512 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.728293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.728334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.728342 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.728363 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.728377 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.831091 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.831169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.831186 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.831208 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.831225 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.934760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.934832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.934848 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.934947 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:24 crc kubenswrapper[4676]: I0930 13:59:24.934972 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:24Z","lastTransitionTime":"2025-09-30T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.038293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.038350 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.038368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.038393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.038407 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.141234 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.141282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.141294 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.141314 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.141331 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.245960 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.246033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.246052 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.246083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.246104 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.349160 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.349458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.349592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.349712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.349807 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.432971 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:25 crc kubenswrapper[4676]: E0930 13:59:25.433754 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.452062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.452111 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.452125 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.452149 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.452163 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.555826 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.556097 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.556164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.556235 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.556306 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.659396 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.660585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.660812 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.661078 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.661295 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.764660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.764745 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.764769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.764800 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.764822 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.867609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.867663 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.867674 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.867693 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.867706 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.970936 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.970987 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.971000 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.971024 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:25 crc kubenswrapper[4676]: I0930 13:59:25.971051 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:25Z","lastTransitionTime":"2025-09-30T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.074738 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.075242 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.075314 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.075398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.075479 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.178491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.178541 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.178553 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.178573 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.178587 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.281013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.281069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.281105 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.281120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.281131 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.383835 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.383871 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.383894 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.383909 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.383918 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.432425 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.432569 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:26 crc kubenswrapper[4676]: E0930 13:59:26.432563 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.432837 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:26 crc kubenswrapper[4676]: E0930 13:59:26.433036 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.433263 4676 scope.go:117] "RemoveContainer" containerID="63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90" Sep 30 13:59:26 crc kubenswrapper[4676]: E0930 13:59:26.433303 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:26 crc kubenswrapper[4676]: E0930 13:59:26.433588 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.486071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.486104 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.486120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.486138 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.486148 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.588294 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.588341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.588352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.588364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.588386 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.691069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.691114 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.691123 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.691140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.691153 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.793580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.793627 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.793636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.793657 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.793668 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.896812 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.896897 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.896911 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.896934 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:26 crc kubenswrapper[4676]: I0930 13:59:26.896949 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:26Z","lastTransitionTime":"2025-09-30T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.000090 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.000173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.000186 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.000202 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.000214 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.103140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.103405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.103474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.103544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.103627 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.207342 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.207412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.207427 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.207451 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.207465 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.311552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.311601 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.311614 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.311638 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.311652 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.414796 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.415243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.415338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.415461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.415560 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.432590 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:27 crc kubenswrapper[4676]: E0930 13:59:27.432788 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.446448 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.461389 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.472908 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.489703 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.510793 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook 
\\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6
f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.518312 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.518352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.518389 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.518406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.518433 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.534374 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.548671 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.562863 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.580456 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.595721 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.610552 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.622088 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.622143 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.622161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.622189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.622208 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.624648 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.640852 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.658726 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d
4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.675026 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.690535 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc 
kubenswrapper[4676]: I0930 13:59:27.707525 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb76
6b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.731247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.731334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.731352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.731381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.731400 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.731443 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:27Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.834801 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.834870 4676 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.834906 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.834928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.834940 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.937748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.937806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.937818 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.937840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.937853 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:27Z","lastTransitionTime":"2025-09-30T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:27 crc kubenswrapper[4676]: I0930 13:59:27.977150 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:27 crc kubenswrapper[4676]: E0930 13:59:27.977369 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:59:27 crc kubenswrapper[4676]: E0930 13:59:27.977475 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 13:59:59.977453933 +0000 UTC m=+103.960542352 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.040730 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.040766 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.040776 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.040790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.040800 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.143252 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.143353 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.143375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.143400 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.143415 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.246278 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.246334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.246344 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.246364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.246377 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.349458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.349519 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.349535 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.349555 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.349566 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.432768 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.432819 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.432785 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:28 crc kubenswrapper[4676]: E0930 13:59:28.432909 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:28 crc kubenswrapper[4676]: E0930 13:59:28.433001 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:28 crc kubenswrapper[4676]: E0930 13:59:28.433062 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.452268 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.452296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.452306 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.452321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.452333 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.554491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.554535 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.554548 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.554566 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.554576 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.657080 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.657113 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.657122 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.657134 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.657144 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.759337 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.759391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.759411 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.759428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.759436 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.862906 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.862941 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.862950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.862967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.862977 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.965841 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.965905 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.965916 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.965932 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:28 crc kubenswrapper[4676]: I0930 13:59:28.965945 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:28Z","lastTransitionTime":"2025-09-30T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.069700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.069774 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.069789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.069814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.069831 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.171866 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.171944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.171957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.171975 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.171987 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.275814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.275855 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.275868 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.275903 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.275917 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.377971 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.378012 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.378020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.378034 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.378042 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.432009 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:29 crc kubenswrapper[4676]: E0930 13:59:29.432172 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.480789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.480833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.480842 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.480855 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.480864 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.582806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.582857 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.582867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.582913 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.582928 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.685443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.685479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.685488 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.685501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.685511 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.788390 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.788444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.788454 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.788471 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.788486 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.891341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.891386 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.891395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.891411 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.891420 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.993833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.993871 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.993914 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.993933 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:29 crc kubenswrapper[4676]: I0930 13:59:29.993945 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:29Z","lastTransitionTime":"2025-09-30T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.096372 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.096432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.096444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.096463 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.096475 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.198972 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.199011 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.199020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.199033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.199042 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.301299 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.301339 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.301349 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.301364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.301376 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.403361 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.403397 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.403405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.403419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.403428 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.432812 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.432863 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.432920 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:30 crc kubenswrapper[4676]: E0930 13:59:30.432948 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:30 crc kubenswrapper[4676]: E0930 13:59:30.433122 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:30 crc kubenswrapper[4676]: E0930 13:59:30.433165 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.504930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.504971 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.504983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.505000 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.505011 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.607544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.607584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.607594 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.607610 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.607621 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.710298 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.710375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.710387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.710422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.710433 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.811641 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.811678 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.811689 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.811703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.811714 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.913981 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.914029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.914046 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.914067 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:30 crc kubenswrapper[4676]: I0930 13:59:30.914081 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:30Z","lastTransitionTime":"2025-09-30T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.016937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.016977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.016987 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.017004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.017017 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.119338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.119383 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.119394 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.119412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.119426 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.221732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.221760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.221768 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.221780 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.221788 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.324625 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.324684 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.324698 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.324719 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.324732 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.427784 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.427841 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.427852 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.427867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.427901 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.432507 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:31 crc kubenswrapper[4676]: E0930 13:59:31.432817 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.529852 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.529895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.529905 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.529918 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.529927 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.632765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.632825 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.632837 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.632860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.632902 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.735703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.735762 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.735776 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.735799 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.735816 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.815686 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/0.log" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.815736 4676 generic.go:334] "Generic (PLEG): container finished" podID="12808c49-1bed-4251-bcbe-fad6207eea57" containerID="397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049" exitCode=1 Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.815768 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerDied","Data":"397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.816221 4676 scope.go:117] "RemoveContainer" containerID="397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.835177 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1
e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.838680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.839142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.839156 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.839180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.839193 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.848573 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.859830 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc 
kubenswrapper[4676]: I0930 13:59:31.873988 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb76
6b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.884959 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.895721 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.914797 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 
'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6
f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.928641 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.940795 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.941860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.941934 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.941947 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.941966 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.941978 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:31Z","lastTransitionTime":"2025-09-30T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.954666 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.969119 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.982106 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:31 crc kubenswrapper[4676]: I0930 13:59:31.996382 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:31Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.011140 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.026691 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:31Z\\\",\\\"message\\\":\\\"2025-09-30T13:58:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0\\\\n2025-09-30T13:58:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0 to /host/opt/cni/bin/\\\\n2025-09-30T13:58:46Z [verbose] multus-daemon started\\\\n2025-09-30T13:58:46Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:59:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.044095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.044149 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.044182 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.044200 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.044213 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.050552 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.062529 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.074028 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.147254 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.147296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.147307 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.147323 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.147334 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.250070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.250110 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.250119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.250134 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.250144 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.352952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.352997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.353006 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.353022 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.353032 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.431926 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.431976 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.431937 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:32 crc kubenswrapper[4676]: E0930 13:59:32.432053 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:32 crc kubenswrapper[4676]: E0930 13:59:32.432147 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:32 crc kubenswrapper[4676]: E0930 13:59:32.432208 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.455257 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.455320 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.455331 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.455349 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.455360 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.557280 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.557313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.557322 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.557335 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.557344 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.659331 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.659387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.659398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.659416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.659428 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.761871 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.761922 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.761933 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.761948 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.761959 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.820403 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/0.log" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.820475 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerStarted","Data":"c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.831799 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc 
kubenswrapper[4676]: I0930 13:59:32.845461 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9
b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 
13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.856493 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.864480 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.864529 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.864541 4676 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.864557 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.864571 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.867218 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d0
99b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\
\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.880406 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687f
ddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.890907 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.901539 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.919604 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook 
\\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6
f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.931123 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.942844 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.955168 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.967266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.967324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.967335 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.967353 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.967365 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:32Z","lastTransitionTime":"2025-09-30T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.970670 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:32 crc kubenswrapper[4676]: I0930 13:59:32.985136 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.000596 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:32Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.014560 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.028545 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:31Z\\\",\\\"message\\\":\\\"2025-09-30T13:58:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0\\\\n2025-09-30T13:58:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0 to /host/opt/cni/bin/\\\\n2025-09-30T13:58:46Z [verbose] multus-daemon started\\\\n2025-09-30T13:58:46Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:59:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:59:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.049576 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.062775 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:33Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.070002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.070056 
4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.070069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.070087 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.070099 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.172644 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.172717 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.172735 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.172763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.172782 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.275491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.275563 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.275574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.275596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.275609 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.378610 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.378665 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.378678 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.378696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.378709 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.432623 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:33 crc kubenswrapper[4676]: E0930 13:59:33.432935 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.481957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.482022 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.482061 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.482084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.482098 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.585295 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.585337 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.585345 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.585360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.585370 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.688098 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.688135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.688146 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.688161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.688169 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.789741 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.789807 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.789818 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.789832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.789843 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.891770 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.891827 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.891844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.891867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.891913 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.994214 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.994251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.994262 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.994276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:33 crc kubenswrapper[4676]: I0930 13:59:33.994285 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:33Z","lastTransitionTime":"2025-09-30T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.097782 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.097859 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.097940 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.098020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.098044 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.164938 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.164978 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.164986 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.165000 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.165009 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: E0930 13:59:34.176602 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.182623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.182657 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.182670 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.182686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.182697 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: E0930 13:59:34.193638 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.196690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.196729 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.196740 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.196755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.196771 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.225403 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.225438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.225450 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.225465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.225476 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: E0930 13:59:34.237119 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:34Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:34 crc kubenswrapper[4676]: E0930 13:59:34.237261 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.238751 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.238809 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.238821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.238840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.238851 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.341495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.341543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.341552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.341583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.341594 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.432605 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.432663 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:34 crc kubenswrapper[4676]: E0930 13:59:34.432763 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.432814 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:34 crc kubenswrapper[4676]: E0930 13:59:34.432923 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:34 crc kubenswrapper[4676]: E0930 13:59:34.433019 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.443540 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.443575 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.443584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.443596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.443607 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.545365 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.545407 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.545418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.545432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.545443 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.647095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.647137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.647148 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.647164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.647176 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.749125 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.749162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.749171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.749187 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.749197 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.851181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.851443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.851583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.851720 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.851876 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.954982 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.955034 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.955045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.955060 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:34 crc kubenswrapper[4676]: I0930 13:59:34.955070 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:34Z","lastTransitionTime":"2025-09-30T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.058145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.058210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.058222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.058240 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.058250 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.160126 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.160174 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.160218 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.160236 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.160255 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.262913 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.262957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.262967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.262983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.262997 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.365709 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.365756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.365765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.365779 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.365790 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.432949 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:35 crc kubenswrapper[4676]: E0930 13:59:35.433092 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.468621 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.468648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.468662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.468685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.468696 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.571307 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.571343 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.571351 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.571364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.571372 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.673918 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.673962 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.673975 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.673992 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.674004 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.776694 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.776746 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.776757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.776773 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.776786 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.879267 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.879301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.879310 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.879323 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.879331 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.981287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.981355 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.981370 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.981388 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:35 crc kubenswrapper[4676]: I0930 13:59:35.981403 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:35Z","lastTransitionTime":"2025-09-30T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.083796 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.083851 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.083868 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.083972 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.083991 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.187201 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.187272 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.187285 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.187312 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.187327 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.289585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.289667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.289691 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.289724 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.289752 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.393171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.393226 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.393239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.393258 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.393296 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.433032 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.433032 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.433153 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:36 crc kubenswrapper[4676]: E0930 13:59:36.433857 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:36 crc kubenswrapper[4676]: E0930 13:59:36.434003 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:36 crc kubenswrapper[4676]: E0930 13:59:36.434125 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.495873 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.495942 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.495958 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.495974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.496253 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.598624 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.598867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.598952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.599015 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.599079 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.701660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.701698 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.701710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.701726 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.701767 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.804656 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.804692 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.804700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.804713 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.804722 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.906570 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.906611 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.906623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.906639 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:36 crc kubenswrapper[4676]: I0930 13:59:36.906648 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:36Z","lastTransitionTime":"2025-09-30T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.009321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.009359 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.009370 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.009385 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.009396 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.112282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.112315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.112325 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.112340 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.112349 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.214354 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.214395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.214405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.214417 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.214426 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.316994 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.317046 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.317057 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.317071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.317081 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.419581 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.419615 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.419623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.419636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.419645 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.433357 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:37 crc kubenswrapper[4676]: E0930 13:59:37.433502 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.433630 4676 scope.go:117] "RemoveContainer" containerID="63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.453784 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\
\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6
875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.466811 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.478023 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.490009 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.501957 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.513568 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.521484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.521536 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.521550 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.521565 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.521574 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.527948 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.540252 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:31Z\\\",\\\"message\\\":\\\"2025-09-30T13:58:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0\\\\n2025-09-30T13:58:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0 to /host/opt/cni/bin/\\\\n2025-09-30T13:58:46Z [verbose] multus-daemon started\\\\n2025-09-30T13:58:46Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:59:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:59:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.553111 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 
13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.564124 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.573683 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc 
kubenswrapper[4676]: I0930 13:59:37.586449 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb76
6b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.596183 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.605827 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.616688 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.624570 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.624612 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.624623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 
13:59:37.624638 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.624651 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.629568 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.638937 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.658133 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook 
\\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6
f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:37Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.727615 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.727656 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.727668 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.727684 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.727695 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.830640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.830690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.830703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.830721 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.830773 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.933564 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.933608 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.933619 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.933635 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:37 crc kubenswrapper[4676]: I0930 13:59:37.933645 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:37Z","lastTransitionTime":"2025-09-30T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.039483 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.039531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.039543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.039565 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.039578 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.142635 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.142683 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.142691 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.142710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.142721 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.252716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.252763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.252773 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.252791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.252839 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.355632 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.355671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.355680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.355696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.355708 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.432176 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.432244 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.432395 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:38 crc kubenswrapper[4676]: E0930 13:59:38.432689 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:38 crc kubenswrapper[4676]: E0930 13:59:38.432786 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:38 crc kubenswrapper[4676]: E0930 13:59:38.432931 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.467195 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.467251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.467264 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.467286 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.467300 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.571020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.571121 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.571136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.571163 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.571179 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.673818 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.673862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.673872 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.673909 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.673923 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.776362 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.776421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.776434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.776453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.776464 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.844305 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/3.log" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.845674 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/2.log" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.849345 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670" exitCode=1 Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.849408 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.849479 4676 scope.go:117] "RemoveContainer" containerID="63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.850276 4676 scope.go:117] "RemoveContainer" containerID="b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670" Sep 30 13:59:38 crc kubenswrapper[4676]: E0930 13:59:38.850595 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.872682 4676 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook 
\\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:38Z\\\",\\\"message\\\":\\\"nt:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nF0930 13:59:38.701574 6796 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for namespace Informer during admin network policy controller initialization, handler {0x1fcbf20 0x1fcbc00 0x1fcbba0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z]\\\\nI0930 13:59:38.701581 6796 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for 
network=default\\\\nI09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7
a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.879356 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.879395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.879406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.879425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.879437 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.888952 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.906062 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.918779 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.933008 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.950965 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.966634 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.979177 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.982438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.982469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.982479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.982501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.982513 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:38Z","lastTransitionTime":"2025-09-30T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:38 crc kubenswrapper[4676]: I0930 13:59:38.993836 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:31Z\\\",\\\"message\\\":\\\"2025-09-30T13:58:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0\\\\n2025-09-30T13:58:45+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0 to /host/opt/cni/bin/\\\\n2025-09-30T13:58:46Z [verbose] multus-daemon started\\\\n2025-09-30T13:58:46Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:59:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:59:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.024468 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.038868 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.052374 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.068529 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.086664 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.086708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.086734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.086752 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.086772 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.087503 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.105123 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.116605 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc 
kubenswrapper[4676]: I0930 13:59:39.131465 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb76
6b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.144157 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:39Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.190999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.191263 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.191342 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.191428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.191506 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.294856 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.294928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.294946 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.294968 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.294981 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.397525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.397598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.397616 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.397646 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.397664 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.433095 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:39 crc kubenswrapper[4676]: E0930 13:59:39.433242 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.500441 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.500476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.500485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.500501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.500510 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.603527 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.603565 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.603573 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.603588 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.603598 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.705962 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.706029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.706042 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.706062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.706073 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.809369 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.809427 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.809443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.809466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.809482 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.855285 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/3.log" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.911913 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.911964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.911976 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.911996 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:39 crc kubenswrapper[4676]: I0930 13:59:39.912010 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:39Z","lastTransitionTime":"2025-09-30T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.014234 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.014271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.014281 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.014297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.014307 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.117117 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.117165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.117177 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.117198 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.117212 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.210379 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.210485 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.210526 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.210626 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.21060757 +0000 UTC m=+148.193695999 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.210676 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.210724 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.210717163 +0000 UTC m=+148.193805592 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.210736 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.210857 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 14:00:44.210826595 +0000 UTC m=+148.193915204 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.219792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.219951 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.220038 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.220107 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.220173 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.312077 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.312636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.312325 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.313083 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.313162 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.313256 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.313228683 +0000 UTC m=+148.296317112 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.312710 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.313362 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.313391 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.313490 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.313463469 +0000 UTC m=+148.296551938 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.323284 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.323315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.323340 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.323358 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.323370 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.426374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.426439 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.426456 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.426479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.426495 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.432080 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.432147 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.432219 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.432352 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.432451 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:40 crc kubenswrapper[4676]: E0930 13:59:40.432545 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.529207 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.529254 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.529265 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.529282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.529295 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.633425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.633474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.633487 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.633507 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.633519 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.737258 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.737305 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.737317 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.737337 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.737349 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.839834 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.839911 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.839921 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.839938 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.839947 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.942273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.942321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.942333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.942444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:40 crc kubenswrapper[4676]: I0930 13:59:40.942460 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:40Z","lastTransitionTime":"2025-09-30T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.044668 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.044714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.044722 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.044737 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.044746 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.147569 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.147635 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.147673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.147705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.147727 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.250055 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.250106 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.250118 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.250140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.250153 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.353016 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.353071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.353088 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.353108 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.353118 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.432549 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:41 crc kubenswrapper[4676]: E0930 13:59:41.433247 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.442597 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.455483 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.455528 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.455541 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.455596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.455634 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.558213 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.558263 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.558273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.558311 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.558323 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.664383 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.664428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.664437 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.664455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.664466 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.767387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.767437 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.767449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.767471 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.767485 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.870363 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.870410 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.870421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.870438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.870450 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.972778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.973012 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.973121 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.973187 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:41 crc kubenswrapper[4676]: I0930 13:59:41.973260 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:41Z","lastTransitionTime":"2025-09-30T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.075142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.075405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.075490 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.075581 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.075655 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.178816 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.178863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.178873 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.178915 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.178927 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.281543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.281596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.281606 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.281623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.281647 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.384821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.384863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.384901 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.384925 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.384941 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.432458 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.432526 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.432553 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:42 crc kubenswrapper[4676]: E0930 13:59:42.432618 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:42 crc kubenswrapper[4676]: E0930 13:59:42.432714 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:42 crc kubenswrapper[4676]: E0930 13:59:42.432934 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.488666 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.488728 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.488741 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.488780 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.488791 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.591217 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.591260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.591271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.591289 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.591301 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.694101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.694151 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.694162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.694184 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.694197 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.796632 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.796945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.797048 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.797133 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.797206 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.899718 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.899768 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.899779 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.899794 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:42 crc kubenswrapper[4676]: I0930 13:59:42.899805 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:42Z","lastTransitionTime":"2025-09-30T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.002117 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.002474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.002584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.002712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.002795 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.198326 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.198382 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.198393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.198410 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.198420 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.300820 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.300857 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.300867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.300899 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.300909 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.403599 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.403850 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.404066 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.404287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.404493 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.433128 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:43 crc kubenswrapper[4676]: E0930 13:59:43.433657 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.507243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.507488 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.507576 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.507700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.507802 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.610762 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.610810 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.610819 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.610835 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.610845 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.713353 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.713422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.713436 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.713453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.713464 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.815524 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.815567 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.815579 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.815596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.815607 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.917833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.918144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.918222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.918316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:43 crc kubenswrapper[4676]: I0930 13:59:43.918428 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:43Z","lastTransitionTime":"2025-09-30T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.020684 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.020735 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.020744 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.020780 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.020791 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.123663 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.123705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.123714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.123729 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.123740 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.225738 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.225786 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.225794 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.225811 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.225822 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.327769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.327830 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.327862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.327906 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.327927 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.342366 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.345891 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.345930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.345939 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.345956 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.345966 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.356433 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.359680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.359804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.359864 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.359967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.360079 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.371918 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.375500 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.375531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.375541 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.375559 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.375569 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.390744 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.395833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.395914 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.395926 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.395948 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.395959 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.408040 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1d613b32-e65a-4ccd-985a-b2960fc60b41\\\",\\\"systemUUID\\\":\\\"e459744e-c3c7-46eb-b48a-f9d168ffc645\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:44Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.408165 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.410281 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.410334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.410348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.410366 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.410376 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.432559 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.432584 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.432710 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.432711 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.432848 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:44 crc kubenswrapper[4676]: E0930 13:59:44.432953 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.512736 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.512780 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.512789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.512807 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.512818 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.615119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.615360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.615428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.615501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.615595 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.718586 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.718862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.718963 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.719094 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.719173 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.822457 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.822494 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.822504 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.822548 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.822558 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.924704 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.924741 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.924751 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.924767 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:44 crc kubenswrapper[4676]: I0930 13:59:44.924778 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:44Z","lastTransitionTime":"2025-09-30T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.027168 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.027210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.027222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.027238 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.027248 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.129815 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.129865 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.129916 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.129939 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.129958 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.232934 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.232984 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.232996 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.233015 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.233062 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.335814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.335853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.335864 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.335903 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.335915 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.433097 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:45 crc kubenswrapper[4676]: E0930 13:59:45.433277 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.437529 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.437674 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.437769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.437852 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.437958 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.540952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.541002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.541014 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.541029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.541038 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.643147 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.643424 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.643548 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.643629 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.643691 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.746133 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.746176 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.746185 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.746228 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.746241 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.848010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.848048 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.848056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.848072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.848081 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.950841 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.950900 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.950926 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.950945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:45 crc kubenswrapper[4676]: I0930 13:59:45.950955 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:45Z","lastTransitionTime":"2025-09-30T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.054742 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.054791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.054801 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.054817 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.054827 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.158188 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.158241 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.158253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.158273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.158287 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.260590 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.260642 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.260658 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.260683 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.260702 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.363243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.363297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.363312 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.363331 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.363346 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.432674 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.432747 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:46 crc kubenswrapper[4676]: E0930 13:59:46.432850 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:46 crc kubenswrapper[4676]: E0930 13:59:46.433033 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.433143 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:46 crc kubenswrapper[4676]: E0930 13:59:46.433366 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.466716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.466759 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.466769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.466784 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.466794 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.569464 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.569501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.569509 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.569524 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.569533 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.672979 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.673016 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.673025 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.673048 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.673059 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.775741 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.775781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.775792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.775809 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.775820 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.878862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.878944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.878961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.878983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.878999 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.982374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.982437 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.982449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.982468 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:46 crc kubenswrapper[4676]: I0930 13:59:46.982481 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:46Z","lastTransitionTime":"2025-09-30T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.084895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.084934 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.084944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.084960 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.084980 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.187738 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.187790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.187800 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.187818 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.187832 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.290764 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.290812 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.290825 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.290844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.290855 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.394270 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.394321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.394334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.394352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.394366 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.432451 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:47 crc kubenswrapper[4676]: E0930 13:59:47.432636 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.456227 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f9bd59d-f026-4c72-9734-c670da1df387\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3a7307312b0604e0dced8cbc2b8ed07583ec6650b05091d4c5f41a08da70691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://14d07155343479f6d7f452a871bca1361e4fd22efd3ef828281e1836117454a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44aa745d9d01236afaa143bbc447bc230cac1190229c1c74ecdf92555cc721eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f3170673f4e23690bc0c5543265931501d623eda84ee7df8e515e1ad8b6346\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.477372 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03e12a8bf70722b099302a9c3b8d88526bbea68746ea168b2bf41306a3e18641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.494123 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54e1c7778cf1d8d9f6dc68d72d153ea277d19d66d775fa02ed06687e3392040c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://98be87b4cb316839099b69460cc1c1ecd98d5162c7dfc995c8f6d86ad477fedd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.496756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.496788 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.496799 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.496818 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.496831 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.515111 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.533486 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://938f56b18fe369ceeb13a74b0c257fd1b7fafeeb4d331327d4508f482caca32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.548633 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s7q5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12808c49-1bed-4251-bcbe-fad6207eea57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:31Z\\\",\\\"message\\\":\\\"2025-09-30T13:58:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0\\\\n2025-09-30T13:58:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7b957c3-a227-40bb-8e6e-866fafe1bcb0 to /host/opt/cni/bin/\\\\n2025-09-30T13:58:46Z [verbose] multus-daemon started\\\\n2025-09-30T13:58:46Z [verbose] Readiness Indicator file check\\\\n2025-09-30T13:59:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:59:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-84swg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s7q5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.570940 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9343584b-a3e2-45a2-9f71-b815d206f44c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://168155413f2783887fde890a607fd98c1823da5a45b7fc550ef9358750451da7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69000b0d87b6f1e68fb36dd2349f95de211b17cc4e7bd7ec489969feca3810d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0228750af194e47c41da2795fb0b21b2c44b61e7d5a8943a943793954b33e0d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://288cb819756dee05f608c0e6875f3a2b4eb629a5d07db35aaf706a4bc4110899\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c9e51dab4ec7df53e36f707786a013e4fca0909f0d69cb9113dfdff6e0c9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fadd840f5f6c9e788fbe0e7d681edaf1deb04c62830ff54705c72f75d51eacb\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ac24082e5eb0c2a0d037e6eddcca39f94822db8d6034e39d70ceebe2a0e787d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac6999fdb5545e3df7c9c8e66ac4ea1344a9f79d16881ada1f4f8046539da73b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.583765 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2355fa03-bfc0-4ba0-941c-ceee570ab4b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb64f69d88447dcfe2a2e206f1e59b4b10f370ac02a3f9653e0b95e6d94529c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f628b643cc4c4a9a0d4b2c6d923d6ee5839b325a873da093e84dfc48255548f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86ba7b90dcf7a1fcd028d2fcf19bb86de8c58446ef34191d50a5055f4042bb71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:
19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5591cefc1293c938f8a2a8db06c7287d0e54d55a9d99c65a40126ca77abccd80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.595288 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-sksn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e47ea2c6-e937-4411-b4c9-98048a5e5f05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gtbn7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-sksn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc 
kubenswrapper[4676]: I0930 13:59:47.599412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.599491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.599511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.599546 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.599566 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.609128 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86284cfd-7d99-4f3b-bb3a-593b66737ae3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf15b6b87f6e760bf922b02b49658153875a79c4253bc50a895372c460e0f077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f838844cf38dd9b54b6054863a395010ea5cb87ca8066642ffe388365a91f5a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://151ae8ebe882d8c0150cd97088a56e52f55f0ae066819f8e4aaae2279a61881f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff5fec130c99031e2cb7164e0a3d5c4af84e2b2ac25d3cf255008dc5c1ab967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3049f477ecf683e57b583e8f15aa9e5335347be632ffc02799ef279fefe9b446\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0930 13:58:31.152502 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 13:58:31.168708 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3623665920/tls.crt::/tmp/serving-cert-3623665920/tls.key\\\\\\\"\\\\nI0930 13:58:36.402239 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 13:58:36.407288 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 13:58:36.407313 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 13:58:36.407332 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 13:58:36.407337 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 13:58:36.414209 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 13:58:36.414232 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414236 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 13:58:36.414241 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 13:58:36.414244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 13:58:36.414247 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 13:58:36.414250 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 13:58:36.414249 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 13:58:36.417272 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2e2c7ce3911aaca47ffc14e5628e9efb3b32594d812028b2659b8f8ca681d53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd1addc747ae070d141b4cef8297dbee1e8ecf98202d21d259bb44d84e53d6da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.619968 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f8ff358-e9b0-478e-acbf-e30059107be1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a57e948acbb622af47bb656f844fcdca4ff7dbf2256fc07d7a73951621b06b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f5e3bfb55994ad5563473790f09c8a934f98
e96fe7c88bd1f0dc1ae3dfc7a0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kq87r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pw2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.630182 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af133cb7-f0e4-428e-b348-c6e81493fc1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8aa7f7317a33eb0f00d05f59ec96a210aa79be528b5dc94d099b25fa019511a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d
1e2cc573864d4656d4ade55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z7djb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4k2dp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.639490 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0981ceda-e99c-4074-826c-9b8c5005eb81\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b470ddef2c5c0f7a683cfc156369cebaab5235260a98145e32ae118fafdd214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c198babd51af386813fff3e1bc7b3428c21d2bbfc18c77c3d1b69d664b471e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c198babd51af386813fff3e1bc7b3428c21d2bbfc18c77c3d1b69d664b471e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.656446 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70e11f41-7d9e-49ef-a2f5-0691d5f8f631\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f096ef608f833a542cea2dde328dd04393d90bcf9c20601499b9bb49734e8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68434ff5ca08dde9677cf96ed4703958895aa880048ef99eead372a1a06bcf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f888604a37c64c17b61fbe754fb766b3a8dd363cd84342d3f464ac3b377b703c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70a4a6ea3224024cc371c57de0a5be4082b7bd9092090f7af42bedda1a897d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62090
d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62090d66b5fc63638db525937297cd84b886fc27446d8df9935ee6b45f7d8610\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4452138f5ffc57e76d26761a49ebe4f782695f4d0298426e19d9fc9f87202645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c2ac4de1ea55982e71a3ecd8eaffb3d44648249635c4cdfbe0181f0497a2d1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94m48\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ksfzg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.670158 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7stxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e065baf-f38b-4397-bbbe-ef52eea10f84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f5fe4110f265eb89d70f5d85485514e7fea4ed909359edaa0c76d305d1a9c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nkkjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7stxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.679919 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qmgfw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c0fc06a-eb2b-4fa4-8554-ce8a0fd62074\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbb44b6e0c8081c62bace086876620d80d412864b9afb7fd495413baf0363c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mwsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qmgfw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.699031 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fae6bdf-2a3f-4961-934d-b8f653412538\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63ef46221a74ba486be9f66f0c25be6a8d34edccc58a35fedcd728dc2b61ba90\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:10Z\\\",\\\"message\\\":\\\"eVersion:\\\\\\\"26904\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-sksn7: failed to update pod openshift-multus/network-metrics-daemon-sksn7: Internal error occurred: failed calling webhook 
\\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-08-24T17:21:41Z\\\\nF0930 13:59:10.306789 6383 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:10Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T13:59:38Z\\\",\\\"message\\\":\\\"nt:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nF0930 13:59:38.701574 6796 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for namespace Informer during admin network policy controller initialization, handler {0x1fcbf20 0x1fcbc00 0x1fcbba0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:38Z is after 2025-08-24T17:21:41Z]\\\\nI0930 13:59:38.701581 6796 services_controller.go:356] Processing sync for service openshift-ingress-operator/metrics for 
network=default\\\\nI09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T13:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T13:58:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7
a23bebb4db492fb4b54f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T13:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r529k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T13:58:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9775s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.702144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.702171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.702179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.702213 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.702222 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.709675 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.719840 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T13:58:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T13:59:47Z is after 2025-08-24T17:21:41Z" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.804648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.804692 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.804703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 
13:59:47.804720 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.804731 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.907578 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.907634 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.907644 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.907660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:47 crc kubenswrapper[4676]: I0930 13:59:47.907670 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:47Z","lastTransitionTime":"2025-09-30T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.010562 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.010941 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.011030 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.011101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.011169 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.113714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.113760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.113769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.113784 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.113795 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.216809 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.216905 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.216926 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.216950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.216965 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.319631 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.319673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.319683 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.319697 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.319708 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.422331 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.422373 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.422387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.422406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.422418 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.432761 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:48 crc kubenswrapper[4676]: E0930 13:59:48.432914 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.433077 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:48 crc kubenswrapper[4676]: E0930 13:59:48.433149 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.433088 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:48 crc kubenswrapper[4676]: E0930 13:59:48.433245 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.526013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.526065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.526077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.526099 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.526117 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.629431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.629485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.629497 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.629516 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.629531 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.732092 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.732148 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.732161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.732180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.732191 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.834964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.835030 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.835049 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.835079 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.835097 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.938449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.938519 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.938542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.938574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:48 crc kubenswrapper[4676]: I0930 13:59:48.938598 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:48Z","lastTransitionTime":"2025-09-30T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.041387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.041450 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.041469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.041494 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.041512 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.144083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.144151 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.144164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.144185 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.144200 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.247813 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.247987 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.248077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.248169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.248260 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.356772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.356839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.356850 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.356869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.356896 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.432300 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:49 crc kubenswrapper[4676]: E0930 13:59:49.432535 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.459211 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.459284 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.459296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.459315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.459326 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.562234 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.562306 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.562332 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.562366 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.562389 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.664700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.664755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.664767 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.664786 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.664800 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.775538 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.775589 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.775604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.775631 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.775644 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.878468 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.878531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.878542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.878565 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.878574 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.982527 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.982576 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.982592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.982633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:49 crc kubenswrapper[4676]: I0930 13:59:49.982646 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:49Z","lastTransitionTime":"2025-09-30T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.085814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.085942 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.085971 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.086005 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.086027 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.189177 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.189687 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.189765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.189851 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.189968 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.292161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.292218 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.292229 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.292249 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.292261 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.395207 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.395258 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.395267 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.395284 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.395295 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.432858 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.432934 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.432995 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:50 crc kubenswrapper[4676]: E0930 13:59:50.433022 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:50 crc kubenswrapper[4676]: E0930 13:59:50.433097 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:50 crc kubenswrapper[4676]: E0930 13:59:50.433280 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.497539 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.497582 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.497593 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.497611 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.497624 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.600147 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.600496 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.600628 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.600776 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.600915 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.703804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.703893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.703911 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.703930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.703942 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.806319 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.806352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.806360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.806373 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.806382 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.908853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.908908 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.908920 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.908938 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:50 crc kubenswrapper[4676]: I0930 13:59:50.908951 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:50Z","lastTransitionTime":"2025-09-30T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.012043 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.012085 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.012093 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.012109 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.012119 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.114982 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.115032 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.115044 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.115061 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.115072 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.218305 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.218552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.218616 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.218686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.218794 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.321371 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.321449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.321469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.321494 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.321509 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.424066 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.424660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.424730 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.424802 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.424905 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.432734 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:51 crc kubenswrapper[4676]: E0930 13:59:51.433182 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.532184 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.532238 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.532251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.532442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.532740 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.636640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.636703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.636715 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.636738 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.636750 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.739518 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.739558 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.739568 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.739583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.739594 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.842055 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.842097 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.842107 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.842123 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.842133 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.945185 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.945253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.945264 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.945284 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:51 crc kubenswrapper[4676]: I0930 13:59:51.945296 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:51Z","lastTransitionTime":"2025-09-30T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.047113 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.047158 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.047169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.047186 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.047198 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.149821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.149906 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.149920 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.149937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.149948 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.252371 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.252444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.252453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.252475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.252492 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.355740 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.355782 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.355791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.355833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.355850 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.432845 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.432985 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:52 crc kubenswrapper[4676]: E0930 13:59:52.433157 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.433241 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:52 crc kubenswrapper[4676]: E0930 13:59:52.433319 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:52 crc kubenswrapper[4676]: E0930 13:59:52.433416 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.458852 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.458916 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.458928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.458949 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.458961 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.487342 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.488242 4676 scope.go:117] "RemoveContainer" containerID="b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670" Sep 30 13:59:52 crc kubenswrapper[4676]: E0930 13:59:52.488423 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.551657 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7stxd" podStartSLOduration=71.551636578 podStartE2EDuration="1m11.551636578s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.550816047 +0000 UTC m=+96.533904476" watchObservedRunningTime="2025-09-30 13:59:52.551636578 +0000 UTC m=+96.534725007" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.561695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.561757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.561768 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.561790 
4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.561806 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.563409 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qmgfw" podStartSLOduration=71.563395036 podStartE2EDuration="1m11.563395036s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.562976316 +0000 UTC m=+96.546064755" watchObservedRunningTime="2025-09-30 13:59:52.563395036 +0000 UTC m=+96.546483465" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.642431 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s7q5x" podStartSLOduration=71.642411621 podStartE2EDuration="1m11.642411621s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.618452365 +0000 UTC m=+96.601540804" watchObservedRunningTime="2025-09-30 13:59:52.642411621 +0000 UTC m=+96.625500050" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.642861 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.642854412 podStartE2EDuration="1m15.642854412s" podCreationTimestamp="2025-09-30 
13:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.641826397 +0000 UTC m=+96.624914856" watchObservedRunningTime="2025-09-30 13:59:52.642854412 +0000 UTC m=+96.625942841" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.656511 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=74.656476546 podStartE2EDuration="1m14.656476546s" podCreationTimestamp="2025-09-30 13:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.656438865 +0000 UTC m=+96.639527304" watchObservedRunningTime="2025-09-30 13:59:52.656476546 +0000 UTC m=+96.639564975" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.663937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.663984 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.663996 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.664014 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.664028 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.673450 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.673422051 podStartE2EDuration="42.673422051s" podCreationTimestamp="2025-09-30 13:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.670840548 +0000 UTC m=+96.653928997" watchObservedRunningTime="2025-09-30 13:59:52.673422051 +0000 UTC m=+96.656510480" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.724367 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=76.724349108 podStartE2EDuration="1m16.724349108s" podCreationTimestamp="2025-09-30 13:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.711244877 +0000 UTC m=+96.694333306" watchObservedRunningTime="2025-09-30 13:59:52.724349108 +0000 UTC m=+96.707437537" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.724928 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pw2gp" podStartSLOduration=70.724923812 podStartE2EDuration="1m10.724923812s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.724316117 +0000 UTC m=+96.707404556" watchObservedRunningTime="2025-09-30 13:59:52.724923812 +0000 UTC m=+96.708012241" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.746021 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=11.745994939 podStartE2EDuration="11.745994939s" podCreationTimestamp="2025-09-30 13:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.745137477 +0000 UTC m=+96.728225916" watchObservedRunningTime="2025-09-30 13:59:52.745994939 +0000 UTC m=+96.729083368" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.762085 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ksfzg" podStartSLOduration=71.762061671 podStartE2EDuration="1m11.762061671s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.760419381 +0000 UTC m=+96.743507810" watchObservedRunningTime="2025-09-30 13:59:52.762061671 +0000 UTC m=+96.745150100" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.766077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.766117 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.766129 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.766148 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.766166 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.776050 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podStartSLOduration=71.776024464 podStartE2EDuration="1m11.776024464s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:52.775379628 +0000 UTC m=+96.758468067" watchObservedRunningTime="2025-09-30 13:59:52.776024464 +0000 UTC m=+96.759112903" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.869500 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.869551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.869564 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.869583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.869594 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.973124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.973184 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.973196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.973223 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:52 crc kubenswrapper[4676]: I0930 13:59:52.973234 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:52Z","lastTransitionTime":"2025-09-30T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.075624 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.075671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.075680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.075697 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.075711 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.177998 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.178036 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.178045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.178061 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.178074 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.280811 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.280861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.280874 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.280916 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.280934 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.384002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.384051 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.384061 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.384079 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.384089 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.432654 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:53 crc kubenswrapper[4676]: E0930 13:59:53.432872 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.485893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.485937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.485948 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.485965 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.485976 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.588191 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.588229 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.588239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.588254 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.588265 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.690842 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.690895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.690966 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.690990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.691002 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.793832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.793891 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.793902 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.793925 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.793944 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.897720 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.897793 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.897808 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.897832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:53 crc kubenswrapper[4676]: I0930 13:59:53.897846 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:53Z","lastTransitionTime":"2025-09-30T13:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.002107 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.002171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.002185 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.002209 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.002228 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:54Z","lastTransitionTime":"2025-09-30T13:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.105055 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.105153 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.105168 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.105189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.105202 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:54Z","lastTransitionTime":"2025-09-30T13:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.208053 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.208097 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.208108 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.208124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.208133 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:54Z","lastTransitionTime":"2025-09-30T13:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.310337 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.310389 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.310398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.310413 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.310423 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:54Z","lastTransitionTime":"2025-09-30T13:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.414127 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.414165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.414174 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.414189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.414199 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:54Z","lastTransitionTime":"2025-09-30T13:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.432522 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.432597 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.432722 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:54 crc kubenswrapper[4676]: E0930 13:59:54.432836 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:54 crc kubenswrapper[4676]: E0930 13:59:54.433084 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:54 crc kubenswrapper[4676]: E0930 13:59:54.433157 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.454320 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.454368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.454379 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.454396 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.454408 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T13:59:54Z","lastTransitionTime":"2025-09-30T13:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.491911 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t"] Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.492379 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.494891 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.494908 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.495115 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.495232 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.610184 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e90a48fc-6ca7-4c75-8c13-4a1981796259-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.610240 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e90a48fc-6ca7-4c75-8c13-4a1981796259-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.610262 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e90a48fc-6ca7-4c75-8c13-4a1981796259-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.610285 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90a48fc-6ca7-4c75-8c13-4a1981796259-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.610327 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90a48fc-6ca7-4c75-8c13-4a1981796259-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.711842 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e90a48fc-6ca7-4c75-8c13-4a1981796259-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.711915 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e90a48fc-6ca7-4c75-8c13-4a1981796259-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.711933 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e90a48fc-6ca7-4c75-8c13-4a1981796259-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.711953 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90a48fc-6ca7-4c75-8c13-4a1981796259-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.711957 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e90a48fc-6ca7-4c75-8c13-4a1981796259-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.711987 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90a48fc-6ca7-4c75-8c13-4a1981796259-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.712021 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/e90a48fc-6ca7-4c75-8c13-4a1981796259-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.714113 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e90a48fc-6ca7-4c75-8c13-4a1981796259-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.721600 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e90a48fc-6ca7-4c75-8c13-4a1981796259-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.728273 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90a48fc-6ca7-4c75-8c13-4a1981796259-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mr92t\" (UID: \"e90a48fc-6ca7-4c75-8c13-4a1981796259\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.810326 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" Sep 30 13:59:54 crc kubenswrapper[4676]: W0930 13:59:54.829967 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode90a48fc_6ca7_4c75_8c13_4a1981796259.slice/crio-d49c1f586dca3688a56b46e6b21548052fbac1cab851c2a2313c56c0fee22748 WatchSource:0}: Error finding container d49c1f586dca3688a56b46e6b21548052fbac1cab851c2a2313c56c0fee22748: Status 404 returned error can't find the container with id d49c1f586dca3688a56b46e6b21548052fbac1cab851c2a2313c56c0fee22748 Sep 30 13:59:54 crc kubenswrapper[4676]: I0930 13:59:54.907328 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" event={"ID":"e90a48fc-6ca7-4c75-8c13-4a1981796259","Type":"ContainerStarted","Data":"d49c1f586dca3688a56b46e6b21548052fbac1cab851c2a2313c56c0fee22748"} Sep 30 13:59:55 crc kubenswrapper[4676]: I0930 13:59:55.432428 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:55 crc kubenswrapper[4676]: E0930 13:59:55.432648 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:55 crc kubenswrapper[4676]: I0930 13:59:55.912928 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" event={"ID":"e90a48fc-6ca7-4c75-8c13-4a1981796259","Type":"ContainerStarted","Data":"51c04577fa9a194bb3633a1f8917599ab39897518b7408ea4be2446ee216aa23"} Sep 30 13:59:56 crc kubenswrapper[4676]: I0930 13:59:56.432231 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:56 crc kubenswrapper[4676]: I0930 13:59:56.432231 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:56 crc kubenswrapper[4676]: E0930 13:59:56.432534 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:56 crc kubenswrapper[4676]: E0930 13:59:56.432442 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:56 crc kubenswrapper[4676]: I0930 13:59:56.432253 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:56 crc kubenswrapper[4676]: E0930 13:59:56.432611 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:57 crc kubenswrapper[4676]: I0930 13:59:57.433366 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:57 crc kubenswrapper[4676]: E0930 13:59:57.433801 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 13:59:58 crc kubenswrapper[4676]: I0930 13:59:58.433051 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 13:59:58 crc kubenswrapper[4676]: I0930 13:59:58.433187 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 13:59:58 crc kubenswrapper[4676]: E0930 13:59:58.433325 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 13:59:58 crc kubenswrapper[4676]: E0930 13:59:58.433557 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 13:59:58 crc kubenswrapper[4676]: I0930 13:59:58.433757 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 13:59:58 crc kubenswrapper[4676]: E0930 13:59:58.433834 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 13:59:59 crc kubenswrapper[4676]: I0930 13:59:59.433058 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 13:59:59 crc kubenswrapper[4676]: E0930 13:59:59.433489 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:00 crc kubenswrapper[4676]: I0930 14:00:00.000930 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:00 crc kubenswrapper[4676]: E0930 14:00:00.001119 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 14:00:00 crc kubenswrapper[4676]: E0930 14:00:00.001223 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs podName:e47ea2c6-e937-4411-b4c9-98048a5e5f05 nodeName:}" failed. No retries permitted until 2025-09-30 14:01:04.001198944 +0000 UTC m=+167.984287373 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs") pod "network-metrics-daemon-sksn7" (UID: "e47ea2c6-e937-4411-b4c9-98048a5e5f05") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 14:00:00 crc kubenswrapper[4676]: I0930 14:00:00.432855 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:00 crc kubenswrapper[4676]: I0930 14:00:00.433001 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:00 crc kubenswrapper[4676]: E0930 14:00:00.433212 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:00 crc kubenswrapper[4676]: E0930 14:00:00.433350 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:00 crc kubenswrapper[4676]: I0930 14:00:00.433475 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:00 crc kubenswrapper[4676]: E0930 14:00:00.433598 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:01 crc kubenswrapper[4676]: I0930 14:00:01.433069 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:01 crc kubenswrapper[4676]: E0930 14:00:01.433740 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:02 crc kubenswrapper[4676]: I0930 14:00:02.432869 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:02 crc kubenswrapper[4676]: I0930 14:00:02.432922 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:02 crc kubenswrapper[4676]: I0930 14:00:02.432988 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:02 crc kubenswrapper[4676]: E0930 14:00:02.433031 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:02 crc kubenswrapper[4676]: E0930 14:00:02.433151 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:02 crc kubenswrapper[4676]: E0930 14:00:02.433265 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:03 crc kubenswrapper[4676]: I0930 14:00:03.432091 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:03 crc kubenswrapper[4676]: E0930 14:00:03.432251 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:04 crc kubenswrapper[4676]: I0930 14:00:04.433026 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:04 crc kubenswrapper[4676]: I0930 14:00:04.433021 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:04 crc kubenswrapper[4676]: E0930 14:00:04.433167 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:04 crc kubenswrapper[4676]: I0930 14:00:04.433048 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:04 crc kubenswrapper[4676]: E0930 14:00:04.433346 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:04 crc kubenswrapper[4676]: E0930 14:00:04.433390 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:05 crc kubenswrapper[4676]: I0930 14:00:05.432573 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:05 crc kubenswrapper[4676]: E0930 14:00:05.432751 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:06 crc kubenswrapper[4676]: I0930 14:00:06.432417 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:06 crc kubenswrapper[4676]: I0930 14:00:06.432517 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:06 crc kubenswrapper[4676]: E0930 14:00:06.432596 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:06 crc kubenswrapper[4676]: E0930 14:00:06.432697 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:06 crc kubenswrapper[4676]: I0930 14:00:06.432788 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:06 crc kubenswrapper[4676]: E0930 14:00:06.432832 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:07 crc kubenswrapper[4676]: I0930 14:00:07.432833 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:07 crc kubenswrapper[4676]: E0930 14:00:07.434015 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:07 crc kubenswrapper[4676]: I0930 14:00:07.434838 4676 scope.go:117] "RemoveContainer" containerID="b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670" Sep 30 14:00:07 crc kubenswrapper[4676]: E0930 14:00:07.435099 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9775s_openshift-ovn-kubernetes(4fae6bdf-2a3f-4961-934d-b8f653412538)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" Sep 30 14:00:08 crc kubenswrapper[4676]: I0930 14:00:08.432632 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:08 crc kubenswrapper[4676]: I0930 14:00:08.432698 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:08 crc kubenswrapper[4676]: I0930 14:00:08.432750 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:08 crc kubenswrapper[4676]: E0930 14:00:08.432804 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:08 crc kubenswrapper[4676]: E0930 14:00:08.432955 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:08 crc kubenswrapper[4676]: E0930 14:00:08.433040 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:09 crc kubenswrapper[4676]: I0930 14:00:09.432544 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:09 crc kubenswrapper[4676]: E0930 14:00:09.432721 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:10 crc kubenswrapper[4676]: I0930 14:00:10.432625 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:10 crc kubenswrapper[4676]: I0930 14:00:10.432683 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:10 crc kubenswrapper[4676]: E0930 14:00:10.432842 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:10 crc kubenswrapper[4676]: I0930 14:00:10.433139 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:10 crc kubenswrapper[4676]: E0930 14:00:10.433235 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:10 crc kubenswrapper[4676]: E0930 14:00:10.433416 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:11 crc kubenswrapper[4676]: I0930 14:00:11.432322 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:11 crc kubenswrapper[4676]: E0930 14:00:11.432492 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:12 crc kubenswrapper[4676]: I0930 14:00:12.432069 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:12 crc kubenswrapper[4676]: I0930 14:00:12.432153 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:12 crc kubenswrapper[4676]: I0930 14:00:12.432107 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:12 crc kubenswrapper[4676]: E0930 14:00:12.432222 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:12 crc kubenswrapper[4676]: E0930 14:00:12.432288 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:12 crc kubenswrapper[4676]: E0930 14:00:12.432494 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:13 crc kubenswrapper[4676]: I0930 14:00:13.432461 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:13 crc kubenswrapper[4676]: E0930 14:00:13.432657 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:14 crc kubenswrapper[4676]: I0930 14:00:14.432820 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:14 crc kubenswrapper[4676]: I0930 14:00:14.432910 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:14 crc kubenswrapper[4676]: I0930 14:00:14.432932 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:14 crc kubenswrapper[4676]: E0930 14:00:14.433041 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:14 crc kubenswrapper[4676]: E0930 14:00:14.433177 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:14 crc kubenswrapper[4676]: E0930 14:00:14.433589 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:15 crc kubenswrapper[4676]: I0930 14:00:15.433003 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:15 crc kubenswrapper[4676]: E0930 14:00:15.433172 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:16 crc kubenswrapper[4676]: I0930 14:00:16.432612 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:16 crc kubenswrapper[4676]: I0930 14:00:16.432631 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:16 crc kubenswrapper[4676]: I0930 14:00:16.432652 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:16 crc kubenswrapper[4676]: E0930 14:00:16.432766 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:16 crc kubenswrapper[4676]: E0930 14:00:16.432894 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:16 crc kubenswrapper[4676]: E0930 14:00:16.432988 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.432804 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:17 crc kubenswrapper[4676]: E0930 14:00:17.433866 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:17 crc kubenswrapper[4676]: E0930 14:00:17.459122 4676 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 14:00:17 crc kubenswrapper[4676]: E0930 14:00:17.554004 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.980102 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/1.log" Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.980518 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/0.log" Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.980665 4676 generic.go:334] "Generic (PLEG): container finished" podID="12808c49-1bed-4251-bcbe-fad6207eea57" containerID="c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2" exitCode=1 Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.980767 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerDied","Data":"c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2"} Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.980842 4676 scope.go:117] "RemoveContainer" containerID="397ae2103d09a79613ccc4281fb5318128fa7dc84e8013a5327cf3cd31490049" Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.981355 4676 scope.go:117] "RemoveContainer" containerID="c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2" Sep 30 14:00:17 crc kubenswrapper[4676]: E0930 14:00:17.981552 4676 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-s7q5x_openshift-multus(12808c49-1bed-4251-bcbe-fad6207eea57)\"" pod="openshift-multus/multus-s7q5x" podUID="12808c49-1bed-4251-bcbe-fad6207eea57" Sep 30 14:00:17 crc kubenswrapper[4676]: I0930 14:00:17.997995 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mr92t" podStartSLOduration=96.997975128 podStartE2EDuration="1m36.997975128s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 13:59:55.92873795 +0000 UTC m=+99.911826379" watchObservedRunningTime="2025-09-30 14:00:17.997975128 +0000 UTC m=+121.981063557" Sep 30 14:00:18 crc kubenswrapper[4676]: I0930 14:00:18.432668 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:18 crc kubenswrapper[4676]: I0930 14:00:18.432738 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:18 crc kubenswrapper[4676]: I0930 14:00:18.432805 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:18 crc kubenswrapper[4676]: E0930 14:00:18.433765 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:18 crc kubenswrapper[4676]: E0930 14:00:18.433920 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:18 crc kubenswrapper[4676]: E0930 14:00:18.433947 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:18 crc kubenswrapper[4676]: I0930 14:00:18.984783 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/1.log" Sep 30 14:00:19 crc kubenswrapper[4676]: I0930 14:00:19.432696 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:19 crc kubenswrapper[4676]: E0930 14:00:19.433031 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:20 crc kubenswrapper[4676]: I0930 14:00:20.431960 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:20 crc kubenswrapper[4676]: I0930 14:00:20.432003 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:20 crc kubenswrapper[4676]: I0930 14:00:20.431979 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:20 crc kubenswrapper[4676]: E0930 14:00:20.432158 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:20 crc kubenswrapper[4676]: E0930 14:00:20.432306 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:20 crc kubenswrapper[4676]: E0930 14:00:20.432374 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:21 crc kubenswrapper[4676]: I0930 14:00:21.432483 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:21 crc kubenswrapper[4676]: E0930 14:00:21.432644 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:22 crc kubenswrapper[4676]: I0930 14:00:22.432024 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:22 crc kubenswrapper[4676]: E0930 14:00:22.432186 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:22 crc kubenswrapper[4676]: I0930 14:00:22.432350 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:22 crc kubenswrapper[4676]: I0930 14:00:22.432480 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:22 crc kubenswrapper[4676]: E0930 14:00:22.432984 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:22 crc kubenswrapper[4676]: E0930 14:00:22.433236 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:22 crc kubenswrapper[4676]: I0930 14:00:22.433380 4676 scope.go:117] "RemoveContainer" containerID="b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670" Sep 30 14:00:22 crc kubenswrapper[4676]: E0930 14:00:22.555845 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 30 14:00:23 crc kubenswrapper[4676]: I0930 14:00:23.000556 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/3.log" Sep 30 14:00:23 crc kubenswrapper[4676]: I0930 14:00:23.002997 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerStarted","Data":"3570aaa8d7310d0e69298c35f7c86dd176f0969ffb22846f86bda60c92e1f439"} Sep 30 14:00:23 crc kubenswrapper[4676]: I0930 14:00:23.004298 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 14:00:23 crc kubenswrapper[4676]: I0930 14:00:23.032998 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podStartSLOduration=102.032977595 podStartE2EDuration="1m42.032977595s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:23.03111142 +0000 UTC m=+127.014199859" watchObservedRunningTime="2025-09-30 14:00:23.032977595 +0000 UTC m=+127.016066024" Sep 30 14:00:23 crc kubenswrapper[4676]: I0930 14:00:23.361838 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sksn7"] Sep 30 14:00:23 crc kubenswrapper[4676]: I0930 14:00:23.362578 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:23 crc kubenswrapper[4676]: E0930 14:00:23.362747 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:24 crc kubenswrapper[4676]: I0930 14:00:24.432938 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:24 crc kubenswrapper[4676]: I0930 14:00:24.433021 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:24 crc kubenswrapper[4676]: I0930 14:00:24.432965 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:24 crc kubenswrapper[4676]: E0930 14:00:24.433120 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:24 crc kubenswrapper[4676]: E0930 14:00:24.433270 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:24 crc kubenswrapper[4676]: E0930 14:00:24.433329 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:25 crc kubenswrapper[4676]: I0930 14:00:25.433067 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:25 crc kubenswrapper[4676]: E0930 14:00:25.433265 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:26 crc kubenswrapper[4676]: I0930 14:00:26.432240 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:26 crc kubenswrapper[4676]: I0930 14:00:26.432321 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:26 crc kubenswrapper[4676]: E0930 14:00:26.432397 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:26 crc kubenswrapper[4676]: E0930 14:00:26.432471 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:26 crc kubenswrapper[4676]: I0930 14:00:26.432557 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:26 crc kubenswrapper[4676]: E0930 14:00:26.432780 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:27 crc kubenswrapper[4676]: I0930 14:00:27.434245 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:27 crc kubenswrapper[4676]: E0930 14:00:27.434389 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:27 crc kubenswrapper[4676]: E0930 14:00:27.557164 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 14:00:28 crc kubenswrapper[4676]: I0930 14:00:28.432249 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:28 crc kubenswrapper[4676]: I0930 14:00:28.432260 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:28 crc kubenswrapper[4676]: E0930 14:00:28.432464 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:28 crc kubenswrapper[4676]: I0930 14:00:28.432306 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:28 crc kubenswrapper[4676]: E0930 14:00:28.432612 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:28 crc kubenswrapper[4676]: E0930 14:00:28.432658 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:29 crc kubenswrapper[4676]: I0930 14:00:29.432712 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:29 crc kubenswrapper[4676]: E0930 14:00:29.432916 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:30 crc kubenswrapper[4676]: I0930 14:00:30.432103 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:30 crc kubenswrapper[4676]: E0930 14:00:30.432269 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:30 crc kubenswrapper[4676]: I0930 14:00:30.432345 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:30 crc kubenswrapper[4676]: E0930 14:00:30.432521 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:30 crc kubenswrapper[4676]: I0930 14:00:30.432746 4676 scope.go:117] "RemoveContainer" containerID="c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2" Sep 30 14:00:30 crc kubenswrapper[4676]: I0930 14:00:30.433097 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:30 crc kubenswrapper[4676]: E0930 14:00:30.434172 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:31 crc kubenswrapper[4676]: I0930 14:00:31.031133 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/1.log" Sep 30 14:00:31 crc kubenswrapper[4676]: I0930 14:00:31.031715 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerStarted","Data":"113c6305d3745ecf2f275c9e94641b977c27aa7cac2b903fe4da072121bcae96"} Sep 30 14:00:31 crc kubenswrapper[4676]: I0930 14:00:31.432276 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:31 crc kubenswrapper[4676]: E0930 14:00:31.432487 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sksn7" podUID="e47ea2c6-e937-4411-b4c9-98048a5e5f05" Sep 30 14:00:32 crc kubenswrapper[4676]: I0930 14:00:32.432436 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:32 crc kubenswrapper[4676]: I0930 14:00:32.432490 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:32 crc kubenswrapper[4676]: I0930 14:00:32.432456 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:32 crc kubenswrapper[4676]: E0930 14:00:32.432618 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 14:00:32 crc kubenswrapper[4676]: E0930 14:00:32.432687 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 14:00:32 crc kubenswrapper[4676]: E0930 14:00:32.432780 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 14:00:33 crc kubenswrapper[4676]: I0930 14:00:33.432211 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:00:33 crc kubenswrapper[4676]: I0930 14:00:33.435308 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 14:00:33 crc kubenswrapper[4676]: I0930 14:00:33.442459 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 14:00:34 crc kubenswrapper[4676]: I0930 14:00:34.432563 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:34 crc kubenswrapper[4676]: I0930 14:00:34.432691 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:34 crc kubenswrapper[4676]: I0930 14:00:34.432867 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:34 crc kubenswrapper[4676]: I0930 14:00:34.434870 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 14:00:34 crc kubenswrapper[4676]: I0930 14:00:34.435021 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 14:00:34 crc kubenswrapper[4676]: I0930 14:00:34.435357 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 14:00:34 crc kubenswrapper[4676]: I0930 14:00:34.435441 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.450180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 14:00:35 crc 
kubenswrapper[4676]: I0930 14:00:35.491227 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.491798 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.492940 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qbdpc"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.493645 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.497115 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xr2v8"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.497701 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.498794 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.499292 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.500909 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c9kjj"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.501428 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.501852 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.502305 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.512578 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.513206 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.515113 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.515361 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.515484 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.515656 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.515869 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.516024 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.516291 4676 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qhnkq"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.524205 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.525073 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.525432 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.525807 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.526332 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.527122 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.527139 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.528315 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.529113 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.529401 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.529914 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.530228 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.530489 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.530815 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.531029 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.531400 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.531667 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.531955 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.532247 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 
14:00:35.532543 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.532664 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.532751 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.532987 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.533262 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.533479 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.533971 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.534246 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.534797 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.534991 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.535123 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.554085 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.554397 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.556667 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.556779 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.556947 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.556684 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557042 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557086 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557131 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557042 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557199 4676 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557274 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557354 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.557697 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wwnsl"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.559143 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.559257 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.559166 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.559397 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.559656 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.559759 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.560911 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zgcbx"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.561810 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.562553 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.563025 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.563330 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qnmrm"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.564135 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rs79q"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.564828 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.564897 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8mk5"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.564949 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qnmrm" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.565421 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.568373 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rgs57"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.569137 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.569290 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.569867 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.570967 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5hctc"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.571396 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.571830 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.574577 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.576156 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.577025 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.577105 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c9kjj"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.578435 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qhnkq"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.579930 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.580693 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qbdpc"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.581973 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.582665 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.582954 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.583467 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.583730 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.583849 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.584371 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.594803 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.595110 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.595285 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.595535 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.595776 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.595971 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.596391 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.604812 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10f8e308-867e-4dd1-be59-c73d8297cbfe-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6fq4x\" (UID: \"10f8e308-867e-4dd1-be59-c73d8297cbfe\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.604897 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-config\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.604937 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.604969 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-etcd-client\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.604995 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnf52\" (UniqueName: \"kubernetes.io/projected/ebd6b987-d54a-4692-800a-8eadc5e8690c-kube-api-access-tnf52\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605025 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c806cf0-18da-4035-bf9c-f134a1b23485-audit-dir\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" 
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605052 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-config\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605082 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-serving-cert\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605108 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-etcd-serving-ca\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605133 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be094421-14a0-4ba3-b708-fb08c498c2d6-serving-cert\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605162 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd6b987-d54a-4692-800a-8eadc5e8690c-config\") pod 
\"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605194 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8rn\" (UniqueName: \"kubernetes.io/projected/10f8e308-867e-4dd1-be59-c73d8297cbfe-kube-api-access-5h8rn\") pod \"cluster-samples-operator-665b6dd947-6fq4x\" (UID: \"10f8e308-867e-4dd1-be59-c73d8297cbfe\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605229 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwd5w\" (UniqueName: \"kubernetes.io/projected/61b1921b-4102-4959-abd7-c86ca3ae880e-kube-api-access-rwd5w\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605253 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-encryption-config\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605281 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-client-ca\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 
14:00:35.605312 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79abcd65-3889-4034-874b-a8c0ad78caac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605342 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ebd6b987-d54a-4692-800a-8eadc5e8690c-images\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605386 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d26a632-0017-40e4-ac56-5794c4800d83-node-pullsecrets\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605428 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26fvv\" (UniqueName: \"kubernetes.io/projected/4c806cf0-18da-4035-bf9c-f134a1b23485-kube-api-access-26fvv\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605474 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzlh4\" (UniqueName: \"kubernetes.io/projected/0b3b285f-9406-4f9b-9768-8827933418d7-kube-api-access-vzlh4\") pod 
\"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605505 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3b285f-9406-4f9b-9768-8827933418d7-serving-cert\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605529 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebd6b987-d54a-4692-800a-8eadc5e8690c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605556 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-config\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605583 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-audit\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605613 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605642 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64dkj\" (UniqueName: \"kubernetes.io/projected/79abcd65-3889-4034-874b-a8c0ad78caac-kube-api-access-64dkj\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605674 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-service-ca-bundle\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605701 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b1921b-4102-4959-abd7-c86ca3ae880e-serving-cert\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605729 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-image-import-ca\") pod 
\"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605760 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-audit-policies\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605782 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8xj6\" (UniqueName: \"kubernetes.io/projected/be094421-14a0-4ba3-b708-fb08c498c2d6-kube-api-access-k8xj6\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605809 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsn67\" (UniqueName: \"kubernetes.io/projected/5d26a632-0017-40e4-ac56-5794c4800d83-kube-api-access-bsn67\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605840 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605870 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605911 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d26a632-0017-40e4-ac56-5794c4800d83-audit-dir\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605938 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-etcd-client\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.605967 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-encryption-config\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.606004 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-client-ca\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 
30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.606033 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-config\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.606075 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abcd65-3889-4034-874b-a8c0ad78caac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.606104 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-serving-cert\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.606132 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.607680 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z"] Sep 30 14:00:35 crc kubenswrapper[4676]: 
I0930 14:00:35.609340 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.609822 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.612288 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.613447 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.613704 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.617179 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.618362 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.619782 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.620837 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.622086 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.622524 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.623155 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.624009 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.626043 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.627299 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.627548 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.627940 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628107 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628229 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628333 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628594 4676 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628691 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628759 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628828 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.628934 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.630455 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.630682 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.630757 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.631264 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.631377 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.631492 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 
14:00:35.632669 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.632790 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.633561 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.633761 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.633857 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tj9mh"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.633915 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.634036 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.634114 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.635551 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wwnsl"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.635577 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.639761 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.642553 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.642815 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.643545 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.644436 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.644628 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.644895 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.648842 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.649239 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.649593 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.649839 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.650672 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.651439 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.652552 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.653862 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5rrn6"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.654935 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.655254 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.656715 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.656906 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5hctc"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.657299 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.658434 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.663915 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.664677 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.664797 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.665531 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.666414 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.667827 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.668810 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.669960 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.670535 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.671417 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wfv7t"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.671781 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.671863 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.672197 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.674961 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.676081 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.688726 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wrkd"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.690035 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.692409 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.705082 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.706211 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709090 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c806cf0-18da-4035-bf9c-f134a1b23485-audit-dir\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709150 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-serving-cert\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709181 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5dxr\" (UniqueName: \"kubernetes.io/projected/fa148973-699f-4aeb-839b-ec65aa1a2a37-kube-api-access-k5dxr\") pod \"multus-admission-controller-857f4d67dd-5rrn6\" (UID: \"fa148973-699f-4aeb-839b-ec65aa1a2a37\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" Sep 30 14:00:35 crc kubenswrapper[4676]: 
I0930 14:00:35.709210 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-config\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709219 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c806cf0-18da-4035-bf9c-f134a1b23485-audit-dir\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709236 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-serving-cert\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709481 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-etcd-serving-ca\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709538 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be094421-14a0-4ba3-b708-fb08c498c2d6-serving-cert\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 
14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709568 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd6b987-d54a-4692-800a-8eadc5e8690c-config\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709601 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8rn\" (UniqueName: \"kubernetes.io/projected/10f8e308-867e-4dd1-be59-c73d8297cbfe-kube-api-access-5h8rn\") pod \"cluster-samples-operator-665b6dd947-6fq4x\" (UID: \"10f8e308-867e-4dd1-be59-c73d8297cbfe\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709629 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709668 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sktk\" (UniqueName: \"kubernetes.io/projected/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-kube-api-access-2sktk\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709707 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwd5w\" (UniqueName: 
\"kubernetes.io/projected/61b1921b-4102-4959-abd7-c86ca3ae880e-kube-api-access-rwd5w\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709730 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-encryption-config\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709754 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-client-ca\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709789 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79abcd65-3889-4034-874b-a8c0ad78caac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709815 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ebd6b987-d54a-4692-800a-8eadc5e8690c-images\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709862 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d26a632-0017-40e4-ac56-5794c4800d83-node-pullsecrets\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.709932 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26fvv\" (UniqueName: \"kubernetes.io/projected/4c806cf0-18da-4035-bf9c-f134a1b23485-kube-api-access-26fvv\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710002 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzlh4\" (UniqueName: \"kubernetes.io/projected/0b3b285f-9406-4f9b-9768-8827933418d7-kube-api-access-vzlh4\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710044 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3b285f-9406-4f9b-9768-8827933418d7-serving-cert\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebd6b987-d54a-4692-800a-8eadc5e8690c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: 
\"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710111 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-config\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710138 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-audit\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710168 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710200 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710234 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64dkj\" (UniqueName: \"kubernetes.io/projected/79abcd65-3889-4034-874b-a8c0ad78caac-kube-api-access-64dkj\") 
pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710258 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d26a632-0017-40e4-ac56-5794c4800d83-node-pullsecrets\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710278 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-service-ca-bundle\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710316 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b1921b-4102-4959-abd7-c86ca3ae880e-serving-cert\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710345 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-image-import-ca\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710379 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710412 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmdq\" (UniqueName: \"kubernetes.io/projected/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-kube-api-access-rrmdq\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710481 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-audit-policies\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710520 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8xj6\" (UniqueName: \"kubernetes.io/projected/be094421-14a0-4ba3-b708-fb08c498c2d6-kube-api-access-k8xj6\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710558 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsn67\" (UniqueName: \"kubernetes.io/projected/5d26a632-0017-40e4-ac56-5794c4800d83-kube-api-access-bsn67\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710590 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710622 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-config\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710661 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710689 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d26a632-0017-40e4-ac56-5794c4800d83-audit-dir\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710721 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-etcd-client\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710747 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-encryption-config\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710794 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-client-ca\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710781 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-etcd-serving-ca\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fa148973-699f-4aeb-839b-ec65aa1a2a37-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5rrn6\" (UID: \"fa148973-699f-4aeb-839b-ec65aa1a2a37\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710867 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-config\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: 
\"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710924 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abcd65-3889-4034-874b-a8c0ad78caac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710944 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-serving-cert\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.710976 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.711031 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10f8e308-867e-4dd1-be59-c73d8297cbfe-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6fq4x\" (UID: \"10f8e308-867e-4dd1-be59-c73d8297cbfe\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.711056 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-trusted-ca\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.711090 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-config\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.711115 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.711137 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-etcd-client\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.711166 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnf52\" (UniqueName: \"kubernetes.io/projected/ebd6b987-d54a-4692-800a-8eadc5e8690c-kube-api-access-tnf52\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.711219 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-config\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.712138 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd6b987-d54a-4692-800a-8eadc5e8690c-config\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.712205 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qnmrm"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.712241 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-audit\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.712298 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d26a632-0017-40e4-ac56-5794c4800d83-audit-dir\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.712787 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.712894 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-config\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.713002 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rgs57"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.714906 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ebd6b987-d54a-4692-800a-8eadc5e8690c-images\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.714945 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-client-ca\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.715871 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-config\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.716005 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-client-ca\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.716075 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.717063 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79abcd65-3889-4034-874b-a8c0ad78caac-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.717452 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.718354 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-encryption-config\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.718373 4676 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be094421-14a0-4ba3-b708-fb08c498c2d6-serving-cert\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.718917 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-service-ca-bundle\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.719622 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c806cf0-18da-4035-bf9c-f134a1b23485-audit-policies\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.719674 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79abcd65-3889-4034-874b-a8c0ad78caac-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.720126 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.720129 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/be094421-14a0-4ba3-b708-fb08c498c2d6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.720641 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.720675 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-etcd-client\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.720958 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10f8e308-867e-4dd1-be59-c73d8297cbfe-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-6fq4x\" (UID: \"10f8e308-867e-4dd1-be59-c73d8297cbfe\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.721174 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-serving-cert\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.721318 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5d26a632-0017-40e4-ac56-5794c4800d83-image-import-ca\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.740567 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b1921b-4102-4959-abd7-c86ca3ae880e-serving-cert\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.740940 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-serving-cert\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.740940 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4c806cf0-18da-4035-bf9c-f134a1b23485-encryption-config\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.741020 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-config\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.741606 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/5d26a632-0017-40e4-ac56-5794c4800d83-etcd-client\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.741981 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3b285f-9406-4f9b-9768-8827933418d7-serving-cert\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.743101 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rs79q"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.743744 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.744633 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rnmq9"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.745546 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.745659 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.747355 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-h8fq9"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.748484 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebd6b987-d54a-4692-800a-8eadc5e8690c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.749281 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xr2v8"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.749424 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.750918 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.752062 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.752211 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.753843 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.756750 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.758434 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.760009 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.768170 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.769551 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.770651 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.771761 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sngkt"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.772433 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.772801 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.776146 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j465h"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.777026 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j465h" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.777783 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.779245 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.780453 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.781894 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5rrn6"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.783006 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zgcbx"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.786601 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.788073 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.789090 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sngkt"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.790093 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j465h"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.791120 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.792283 4676 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.792395 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.793535 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.794550 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wfv7t"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.795646 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8mk5"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.796772 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wrkd"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.798312 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.799336 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rnmq9"] Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.811689 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.811725 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.811751 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmdq\" (UniqueName: \"kubernetes.io/projected/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-kube-api-access-rrmdq\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.811783 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-config\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.811813 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fa148973-699f-4aeb-839b-ec65aa1a2a37-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5rrn6\" (UID: \"fa148973-699f-4aeb-839b-ec65aa1a2a37\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.812165 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-trusted-ca\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: 
I0930 14:00:35.812224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-serving-cert\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.812245 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5dxr\" (UniqueName: \"kubernetes.io/projected/fa148973-699f-4aeb-839b-ec65aa1a2a37-kube-api-access-k5dxr\") pod \"multus-admission-controller-857f4d67dd-5rrn6\" (UID: \"fa148973-699f-4aeb-839b-ec65aa1a2a37\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.812281 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.812305 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sktk\" (UniqueName: \"kubernetes.io/projected/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-kube-api-access-2sktk\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.812353 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.812584 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-config\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.813295 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-trusted-ca\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.813352 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.815565 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-serving-cert\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.815728 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.832864 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.851795 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.872555 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.892311 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.912080 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.938142 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.952428 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.972441 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Sep 30 14:00:35 crc kubenswrapper[4676]: I0930 14:00:35.997811 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.012098 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.033121 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.052598 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.072638 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.092729 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.111960 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.133388 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.153089 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.173006 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.193006 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.212947 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.231494 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.252692 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.272663 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.312595 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.332477 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.352923 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.372920 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.392588 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.412934 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.433013 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.453140 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.472800 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.493273 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.512453 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.532351 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.551996 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.573322 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.594545 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.613677 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.632365 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.652294 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.671014 4676 request.go:700] Waited for 1.017290769s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serving-cert&limit=500&resourceVersion=0
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.672783 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.692865 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.705212 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fa148973-699f-4aeb-839b-ec65aa1a2a37-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5rrn6\" (UID: \"fa148973-699f-4aeb-839b-ec65aa1a2a37\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.712955 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.732846 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.773527 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.792986 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.812641 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.832817 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.852891 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.872565 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.892650 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.912311 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.932089 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.952524 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.972906 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Sep 30 14:00:36 crc kubenswrapper[4676]: I0930 14:00:36.991569 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.012843 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.033186 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.053291 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.072848 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.091567 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.112696 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.131938 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.152427 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.172754 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.191953 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.213260 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.238137 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.252613 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.272196 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.309511 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwd5w\" (UniqueName: \"kubernetes.io/projected/61b1921b-4102-4959-abd7-c86ca3ae880e-kube-api-access-rwd5w\") pod \"controller-manager-879f6c89f-qbdpc\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.326819 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnf52\" (UniqueName: \"kubernetes.io/projected/ebd6b987-d54a-4692-800a-8eadc5e8690c-kube-api-access-tnf52\") pod \"machine-api-operator-5694c8668f-c9kjj\" (UID: \"ebd6b987-d54a-4692-800a-8eadc5e8690c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.352605 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8rn\" (UniqueName: \"kubernetes.io/projected/10f8e308-867e-4dd1-be59-c73d8297cbfe-kube-api-access-5h8rn\") pod \"cluster-samples-operator-665b6dd947-6fq4x\" (UID: \"10f8e308-867e-4dd1-be59-c73d8297cbfe\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.367300 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26fvv\" (UniqueName: \"kubernetes.io/projected/4c806cf0-18da-4035-bf9c-f134a1b23485-kube-api-access-26fvv\") pod \"apiserver-7bbb656c7d-4mgv7\" (UID: \"4c806cf0-18da-4035-bf9c-f134a1b23485\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.370833 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.389770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzlh4\" (UniqueName: \"kubernetes.io/projected/0b3b285f-9406-4f9b-9768-8827933418d7-kube-api-access-vzlh4\") pod \"route-controller-manager-6576b87f9c-gn7dj\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.407664 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64dkj\" (UniqueName: \"kubernetes.io/projected/79abcd65-3889-4034-874b-a8c0ad78caac-kube-api-access-64dkj\") pod \"openshift-apiserver-operator-796bbdcf4f-h9nl2\" (UID: \"79abcd65-3889-4034-874b-a8c0ad78caac\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.411149 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.432573 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8xj6\" (UniqueName: \"kubernetes.io/projected/be094421-14a0-4ba3-b708-fb08c498c2d6-kube-api-access-k8xj6\") pod \"authentication-operator-69f744f599-qhnkq\" (UID: \"be094421-14a0-4ba3-b708-fb08c498c2d6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.441731 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.447294 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.452569 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsn67\" (UniqueName: \"kubernetes.io/projected/5d26a632-0017-40e4-ac56-5794c4800d83-kube-api-access-bsn67\") pod \"apiserver-76f77b778f-xr2v8\" (UID: \"5d26a632-0017-40e4-ac56-5794c4800d83\") " pod="openshift-apiserver/apiserver-76f77b778f-xr2v8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.453050 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.464328 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.471555 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.473529 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.492671 4676 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.512585 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.534856 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.553747 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.571859 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.572937 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qbdpc"]
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.591510 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.595952 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2"]
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.613383 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Sep 30 14:00:37 crc kubenswrapper[4676]: W0930 14:00:37.617215 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79abcd65_3889_4034_874b_a8c0ad78caac.slice/crio-363c233ef7986a33e0f508eb1e76d4c1fa5622594f4f9ebda3c3aaa37fb5fda6 WatchSource:0}: Error finding container 363c233ef7986a33e0f508eb1e76d4c1fa5622594f4f9ebda3c3aaa37fb5fda6: Status 404 returned error can't find the container with id 363c233ef7986a33e0f508eb1e76d4c1fa5622594f4f9ebda3c3aaa37fb5fda6
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.633994 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.653180 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.659589 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.672157 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.682680 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.691126 4676 request.go:700] Waited for 1.913793625s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.692924 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.731875 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmdq\" (UniqueName: \"kubernetes.io/projected/3fc15677-b9d9-4e15-ac99-4072e3ad4c91-kube-api-access-rrmdq\") pod \"console-operator-58897d9998-wwnsl\" (UID: \"3fc15677-b9d9-4e15-ac99-4072e3ad4c91\") " pod="openshift-console-operator/console-operator-58897d9998-wwnsl"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.748015 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.769065 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7"]
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.770221 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qhnkq"]
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.772586 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sktk\" (UniqueName: \"kubernetes.io/projected/c931aa3f-02a2-4dd7-bd02-a6a8e4484875-kube-api-access-2sktk\") pod \"cluster-image-registry-operator-dc59b4c8b-7sxgh\" (UID: \"c931aa3f-02a2-4dd7-bd02-a6a8e4484875\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh"
Sep 30 14:00:37 crc kubenswrapper[4676]: W0930 14:00:37.792768 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c806cf0_18da_4035_bf9c_f134a1b23485.slice/crio-6ae3821bf55a85b5def945d0548e428c07e6779baa03172d009f39ebf5fe163b WatchSource:0}: Error finding container 6ae3821bf55a85b5def945d0548e428c07e6779baa03172d009f39ebf5fe163b: Status 404 returned error can't find the container with id 6ae3821bf55a85b5def945d0548e428c07e6779baa03172d009f39ebf5fe163b
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.795678 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5dxr\" (UniqueName: \"kubernetes.io/projected/fa148973-699f-4aeb-839b-ec65aa1a2a37-kube-api-access-k5dxr\") pod \"multus-admission-controller-857f4d67dd-5rrn6\" (UID: \"fa148973-699f-4aeb-839b-ec65aa1a2a37\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.845510 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wwnsl"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846602 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846641 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kv7h\" (UniqueName: \"kubernetes.io/projected/3c5e85dd-ceb0-40eb-86b8-353702e07379-kube-api-access-9kv7h\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846666 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2376906-3f0c-4ba2-b27b-ae1464676554-metrics-tls\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846692 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1f17b4e-4e0a-418e-9c13-33300291d209-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846718 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvqh\" (UniqueName: \"kubernetes.io/projected/97f344fc-42a3-4630-af31-ea25b72941e6-kube-api-access-nvvqh\") pod \"control-plane-machine-set-operator-78cbb6b69f-wb8d2\" (UID: \"97f344fc-42a3-4630-af31-ea25b72941e6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846741 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-certificates\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846764 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f17b4e-4e0a-418e-9c13-33300291d209-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846787 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-service-ca\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846812 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3efa2991-05a6-465c-b8c1-105edce450d9-proxy-tls\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846833 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeefe4d3-1e4c-4149-9da0-9c3533991a83-config\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-oauth-config\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846879 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-policies\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846945 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-default-certificate\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846974 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a60ae8d9-5908-469c-aede-1af2fb6b8631-metrics-tls\") pod \"dns-operator-744455d44c-rs79q\" (UID: \"a60ae8d9-5908-469c-aede-1af2fb6b8631\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs79q"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.846996 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-config\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847052 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847074 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeefe4d3-1e4c-4149-9da0-9c3533991a83-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847342 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7545eda-fbd7-4d47-8f48-084b1319bf34-service-ca-bundle\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847410 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9ls\" (UniqueName: \"kubernetes.io/projected/b7545eda-fbd7-4d47-8f48-084b1319bf34-kube-api-access-7t9ls\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847436 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2376906-3f0c-4ba2-b27b-ae1464676554-trusted-ca\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847471 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-tls\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847495 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eeefe4d3-1e4c-4149-9da0-9c3533991a83-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847542 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f91148-b6d8-4a40-a4b4-f0411e9617ed-serving-cert\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847571 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2376906-3f0c-4ba2-b27b-ae1464676554-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847618 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847653 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847680 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f17b4e-4e0a-418e-9c13-33300291d209-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847726 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-srv-cert\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.847763 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: E0930 14:00:37.848093 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.348075811 +0000 UTC m=+142.331164240 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848388 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848439 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjkk4\" (UniqueName: \"kubernetes.io/projected/f2376906-3f0c-4ba2-b27b-ae1464676554-kube-api-access-rjkk4\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848465 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-config\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848566 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56bpp\" (UniqueName: 
\"kubernetes.io/projected/a296de91-e1c3-4820-a2ec-cedfc4eac0db-kube-api-access-56bpp\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848635 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-profile-collector-cert\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848736 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c9f7b25-e209-4e15-b74f-ac572638fc9a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848917 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dm7l\" (UniqueName: \"kubernetes.io/projected/6bf298e3-e793-4a56-9856-094437b77046-kube-api-access-2dm7l\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.848962 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-proxy-tls\") pod 
\"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849003 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbr68\" (UniqueName: \"kubernetes.io/projected/a60ae8d9-5908-469c-aede-1af2fb6b8631-kube-api-access-bbr68\") pod \"dns-operator-744455d44c-rs79q\" (UID: \"a60ae8d9-5908-469c-aede-1af2fb6b8631\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849046 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-config\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849105 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bf298e3-e793-4a56-9856-094437b77046-srv-cert\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849128 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-ca\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849153 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849175 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a296de91-e1c3-4820-a2ec-cedfc4eac0db-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849305 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849369 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcfxh\" (UniqueName: \"kubernetes.io/projected/3efa2991-05a6-465c-b8c1-105edce450d9-kube-api-access-hcfxh\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849469 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-stats-auth\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849501 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-oauth-serving-cert\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.849794 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c68x\" (UniqueName: \"kubernetes.io/projected/2ef453c0-cdb0-4da7-8c32-3b975e1009a1-kube-api-access-7c68x\") pod \"downloads-7954f5f757-qnmrm\" (UID: \"2ef453c0-cdb0-4da7-8c32-3b975e1009a1\") " pod="openshift-console/downloads-7954f5f757-qnmrm" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850259 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 
crc kubenswrapper[4676]: I0930 14:00:37.850293 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-service-ca\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a296de91-e1c3-4820-a2ec-cedfc4eac0db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850368 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-metrics-certs\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850392 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lvcb\" (UniqueName: \"kubernetes.io/projected/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-kube-api-access-8lvcb\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850414 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/6bf298e3-e793-4a56-9856-094437b77046-profile-collector-cert\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850439 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8nz6\" (UniqueName: \"kubernetes.io/projected/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-kube-api-access-d8nz6\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850460 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-trusted-ca-bundle\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850495 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlmdb\" (UniqueName: \"kubernetes.io/projected/69f91148-b6d8-4a40-a4b4-f0411e9617ed-kube-api-access-xlmdb\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850515 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f652cd09-f743-4500-bace-2652974c9ef3-serving-cert\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850564 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjmdp\" (UniqueName: \"kubernetes.io/projected/f652cd09-f743-4500-bace-2652974c9ef3-kube-api-access-jjmdp\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850586 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850606 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3efa2991-05a6-465c-b8c1-105edce450d9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850625 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/97f344fc-42a3-4630-af31-ea25b72941e6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wb8d2\" (UID: \"97f344fc-42a3-4630-af31-ea25b72941e6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" Sep 30 14:00:37 crc 
kubenswrapper[4676]: I0930 14:00:37.850666 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-auth-proxy-config\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850712 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850731 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fndjg\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-kube-api-access-fndjg\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850826 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850871 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850979 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-machine-approver-tls\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.850999 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82m47\" (UniqueName: \"kubernetes.io/projected/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-kube-api-access-82m47\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.851217 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c9f7b25-e209-4e15-b74f-ac572638fc9a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.851256 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3efa2991-05a6-465c-b8c1-105edce450d9-images\") pod 
\"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.852470 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-bound-sa-token\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.852495 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h82kj\" (UniqueName: \"kubernetes.io/projected/0c9f7b25-e209-4e15-b74f-ac572638fc9a-kube-api-access-h82kj\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.852600 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f652cd09-f743-4500-bace-2652974c9ef3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.852656 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-serving-cert\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " 
pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.852716 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-dir\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.852914 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-trusted-ca\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.853185 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.853218 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4m5j\" (UniqueName: \"kubernetes.io/projected/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-kube-api-access-x4m5j\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.853255 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-client\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.879207 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.895725 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-c9kjj"] Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.896981 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x"] Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.907026 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"] Sep 30 14:00:37 crc kubenswrapper[4676]: W0930 14:00:37.921708 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebd6b987_d54a_4692_800a_8eadc5e8690c.slice/crio-2bae2e8ec4414d356ef5157b3cd850e900d2218a2d120a096fa4d0c74946b12e WatchSource:0}: Error finding container 2bae2e8ec4414d356ef5157b3cd850e900d2218a2d120a096fa4d0c74946b12e: Status 404 returned error can't find the container with id 2bae2e8ec4414d356ef5157b3cd850e900d2218a2d120a096fa4d0c74946b12e Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.943940 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xr2v8"] Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.953968 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:37 crc kubenswrapper[4676]: E0930 14:00:37.954115 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.454085933 +0000 UTC m=+142.437174362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954232 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjgnk\" (UniqueName: \"kubernetes.io/projected/6709a903-5bfa-42c6-be52-df8efb1d106e-kube-api-access-tjgnk\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954337 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dm7l\" (UniqueName: \"kubernetes.io/projected/6bf298e3-e793-4a56-9856-094437b77046-kube-api-access-2dm7l\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954372 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-proxy-tls\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954394 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbr68\" (UniqueName: \"kubernetes.io/projected/a60ae8d9-5908-469c-aede-1af2fb6b8631-kube-api-access-bbr68\") pod \"dns-operator-744455d44c-rs79q\" (UID: \"a60ae8d9-5908-469c-aede-1af2fb6b8631\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954418 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-config\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954442 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx5vn\" (UniqueName: \"kubernetes.io/projected/515397b2-ff6d-491a-9c49-fb217236b19f-kube-api-access-nx5vn\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bf298e3-e793-4a56-9856-094437b77046-srv-cert\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 
14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954488 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-ca\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954513 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-apiservice-cert\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954537 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68kh\" (UniqueName: \"kubernetes.io/projected/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-kube-api-access-v68kh\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954562 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954585 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a296de91-e1c3-4820-a2ec-cedfc4eac0db-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954608 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/515397b2-ff6d-491a-9c49-fb217236b19f-metrics-tls\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954649 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954693 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfxh\" (UniqueName: \"kubernetes.io/projected/3efa2991-05a6-465c-b8c1-105edce450d9-kube-api-access-hcfxh\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954718 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41db2279-05aa-4558-8f60-1761471ba62c-cert\") pod \"ingress-canary-sngkt\" (UID: \"41db2279-05aa-4558-8f60-1761471ba62c\") " pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954742 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-stats-auth\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954767 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-oauth-serving-cert\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954793 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsxks\" (UniqueName: \"kubernetes.io/projected/c23c466d-b781-4d90-a824-3061cf8890be-kube-api-access-tsxks\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954835 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c68x\" (UniqueName: \"kubernetes.io/projected/2ef453c0-cdb0-4da7-8c32-3b975e1009a1-kube-api-access-7c68x\") pod \"downloads-7954f5f757-qnmrm\" (UID: \"2ef453c0-cdb0-4da7-8c32-3b975e1009a1\") " pod="openshift-console/downloads-7954f5f757-qnmrm" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954883 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc 
kubenswrapper[4676]: I0930 14:00:37.954926 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-signing-key\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954963 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05eb8f48-640e-413a-8845-9b2e3bf86f23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.954989 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955013 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6709a903-5bfa-42c6-be52-df8efb1d106e-config-volume\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955037 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xfd9\" (UniqueName: 
\"kubernetes.io/projected/41db2279-05aa-4558-8f60-1761471ba62c-kube-api-access-4xfd9\") pod \"ingress-canary-sngkt\" (UID: \"41db2279-05aa-4558-8f60-1761471ba62c\") " pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955060 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-service-ca\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955084 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/36b6965c-2288-4689-bb83-d099cd6e4a3d-certs\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955107 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a296de91-e1c3-4820-a2ec-cedfc4eac0db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955134 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-metrics-certs\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955159 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8lvcb\" (UniqueName: \"kubernetes.io/projected/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-kube-api-access-8lvcb\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955182 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bf298e3-e793-4a56-9856-094437b77046-profile-collector-cert\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955205 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8nz6\" (UniqueName: \"kubernetes.io/projected/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-kube-api-access-d8nz6\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955226 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-trusted-ca-bundle\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955249 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6709a903-5bfa-42c6-be52-df8efb1d106e-secret-volume\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955278 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad665bdf-4477-45c2-bcde-96a4042e2176-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r28xn\" (UID: \"ad665bdf-4477-45c2-bcde-96a4042e2176\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955306 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-tmpfs\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955332 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlmdb\" (UniqueName: \"kubernetes.io/projected/69f91148-b6d8-4a40-a4b4-f0411e9617ed-kube-api-access-xlmdb\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955611 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f652cd09-f743-4500-bace-2652974c9ef3-serving-cert\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955630 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-ca\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955805 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-config\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955819 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjmdp\" (UniqueName: \"kubernetes.io/projected/f652cd09-f743-4500-bace-2652974c9ef3-kube-api-access-jjmdp\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955947 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.955982 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3efa2991-05a6-465c-b8c1-105edce450d9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 
14:00:37.956015 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/97f344fc-42a3-4630-af31-ea25b72941e6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wb8d2\" (UID: \"97f344fc-42a3-4630-af31-ea25b72941e6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956048 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-auth-proxy-config\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956079 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956098 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-service-ca\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956115 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48p6\" (UniqueName: 
\"kubernetes.io/projected/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-kube-api-access-j48p6\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956145 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956174 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fndjg\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-kube-api-access-fndjg\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956200 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956228 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956255 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22c2x\" (UniqueName: \"kubernetes.io/projected/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-kube-api-access-22c2x\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956284 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-machine-approver-tls\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956309 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82m47\" (UniqueName: \"kubernetes.io/projected/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-kube-api-access-82m47\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956335 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c9f7b25-e209-4e15-b74f-ac572638fc9a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956377 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956420 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3efa2991-05a6-465c-b8c1-105edce450d9-images\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956445 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-signing-cabundle\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956471 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-csi-data-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956495 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c466d-b781-4d90-a824-3061cf8890be-config\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:37 crc 
kubenswrapper[4676]: I0930 14:00:37.955502 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.956846 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-trusted-ca-bundle\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.958445 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-auth-proxy-config\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.958497 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-bound-sa-token\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.958520 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h82kj\" (UniqueName: \"kubernetes.io/projected/0c9f7b25-e209-4e15-b74f-ac572638fc9a-kube-api-access-h82kj\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.958555 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f652cd09-f743-4500-bace-2652974c9ef3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.958574 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-serving-cert\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.958927 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.959817 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-oauth-serving-cert\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.959940 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.960137 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.959456 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3efa2991-05a6-465c-b8c1-105edce450d9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.961302 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-proxy-tls\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.961597 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f652cd09-f743-4500-bace-2652974c9ef3-serving-cert\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:37 crc kubenswrapper[4676]: 
I0930 14:00:37.961986 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a296de91-e1c3-4820-a2ec-cedfc4eac0db-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962273 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bf298e3-e793-4a56-9856-094437b77046-srv-cert\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962315 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-dir\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962352 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-socket-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962416 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-webhook-cert\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962531 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-machine-approver-tls\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962546 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eb8f48-640e-413a-8845-9b2e3bf86f23-config\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962581 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05eb8f48-640e-413a-8845-9b2e3bf86f23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3efa2991-05a6-465c-b8c1-105edce450d9-images\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962637 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-trusted-ca\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.962675 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-dir\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963387 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963510 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963533 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963679 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4m5j\" (UniqueName: \"kubernetes.io/projected/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-kube-api-access-x4m5j\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963709 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-client\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963735 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjm2d\" (UniqueName: \"kubernetes.io/projected/36b6965c-2288-4689-bb83-d099cd6e4a3d-kube-api-access-sjm2d\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963780 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c23c466d-b781-4d90-a824-3061cf8890be-serving-cert\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.963816 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/515397b2-ff6d-491a-9c49-fb217236b19f-config-volume\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964144 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f652cd09-f743-4500-bace-2652974c9ef3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964234 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964391 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964435 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kv7h\" (UniqueName: \"kubernetes.io/projected/3c5e85dd-ceb0-40eb-86b8-353702e07379-kube-api-access-9kv7h\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964503 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2376906-3f0c-4ba2-b27b-ae1464676554-metrics-tls\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964539 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1f17b4e-4e0a-418e-9c13-33300291d209-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964567 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvqh\" (UniqueName: \"kubernetes.io/projected/97f344fc-42a3-4630-af31-ea25b72941e6-kube-api-access-nvvqh\") pod \"control-plane-machine-set-operator-78cbb6b69f-wb8d2\" (UID: \"97f344fc-42a3-4630-af31-ea25b72941e6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.964699 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-metrics-certs\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965254 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-certificates\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965292 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f17b4e-4e0a-418e-9c13-33300291d209-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965351 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-service-ca\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965378 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3efa2991-05a6-465c-b8c1-105edce450d9-proxy-tls\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965401 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeefe4d3-1e4c-4149-9da0-9c3533991a83-config\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965425 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-oauth-config\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965501 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-policies\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965527 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-default-certificate\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965549 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a60ae8d9-5908-469c-aede-1af2fb6b8631-metrics-tls\") pod \"dns-operator-744455d44c-rs79q\" (UID: \"a60ae8d9-5908-469c-aede-1af2fb6b8631\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs79q"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965579 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-config\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.965606 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-registration-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.966486 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-policies\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.966533 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-service-ca\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.966701 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeefe4d3-1e4c-4149-9da0-9c3533991a83-config\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.966702 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-config\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.967027 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.967068 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeefe4d3-1e4c-4149-9da0-9c3533991a83-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.967507 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-trusted-ca\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.967680 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrftm\" (UniqueName: \"kubernetes.io/projected/63ad8370-13f3-4bcc-80f4-8a8e6b657667-kube-api-access-zrftm\") pod \"migrator-59844c95c7-7ttmn\" (UID: \"63ad8370-13f3-4bcc-80f4-8a8e6b657667\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.967808 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7545eda-fbd7-4d47-8f48-084b1319bf34-service-ca-bundle\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.967919 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9ls\" (UniqueName: \"kubernetes.io/projected/b7545eda-fbd7-4d47-8f48-084b1319bf34-kube-api-access-7t9ls\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.967960 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2376906-3f0c-4ba2-b27b-ae1464676554-trusted-ca\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.968010 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.968049 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-tls\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.968078 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eeefe4d3-1e4c-4149-9da0-9c3533991a83-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.968106 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f91148-b6d8-4a40-a4b4-f0411e9617ed-serving-cert\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.968585 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-certificates\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.968819 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2376906-3f0c-4ba2-b27b-ae1464676554-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.969026 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpj79\" (UniqueName: \"kubernetes.io/projected/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-kube-api-access-tpj79\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.969126 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.969486 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a296de91-e1c3-4820-a2ec-cedfc4eac0db-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.969563 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-stats-auth\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.969767 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970021 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7545eda-fbd7-4d47-8f48-084b1319bf34-service-ca-bundle\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970060 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2376906-3f0c-4ba2-b27b-ae1464676554-trusted-ca\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970104 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f17b4e-4e0a-418e-9c13-33300291d209-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: E0930 14:00:37.970551 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.470529021 +0000 UTC m=+142.453617530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970605 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f17b4e-4e0a-418e-9c13-33300291d209-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970774 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-srv-cert\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970797 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-mountpoint-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970837 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970865 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-plugins-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970906 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970936 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjkk4\" (UniqueName: \"kubernetes.io/projected/f2376906-3f0c-4ba2-b27b-ae1464676554-kube-api-access-rjkk4\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970955 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-config\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.970977 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56bpp\" (UniqueName: \"kubernetes.io/projected/a296de91-e1c3-4820-a2ec-cedfc4eac0db-kube-api-access-56bpp\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.972032 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-profile-collector-cert\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.972140 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fnsg\" (UniqueName: \"kubernetes.io/projected/ad665bdf-4477-45c2-bcde-96a4042e2176-kube-api-access-7fnsg\") pod \"package-server-manager-789f6589d5-r28xn\" (UID: \"ad665bdf-4477-45c2-bcde-96a4042e2176\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.972147 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.972165 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/36b6965c-2288-4689-bb83-d099cd6e4a3d-node-bootstrap-token\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.972641 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c9f7b25-e209-4e15-b74f-ac572638fc9a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.974511 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f91148-b6d8-4a40-a4b4-f0411e9617ed-config\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.975279 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c9f7b25-e209-4e15-b74f-ac572638fc9a-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.975657 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c9f7b25-e209-4e15-b74f-ac572638fc9a-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.976532 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bf298e3-e793-4a56-9856-094437b77046-profile-collector-cert\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.977564 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.978806 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69f91148-b6d8-4a40-a4b4-f0411e9617ed-etcd-client\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.978817 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.979194 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-serving-cert\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.979206 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2376906-3f0c-4ba2-b27b-ae1464676554-metrics-tls\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.979380 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.979837 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b7545eda-fbd7-4d47-8f48-084b1319bf34-default-certificate\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.980578 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-srv-cert\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.981098 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeefe4d3-1e4c-4149-9da0-9c3533991a83-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.981385 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f17b4e-4e0a-418e-9c13-33300291d209-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.981692 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-tls\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:37 crc kubenswrapper[4676]: I0930
14:00:37.981795 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.981866 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69f91148-b6d8-4a40-a4b4-f0411e9617ed-serving-cert\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.981941 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/97f344fc-42a3-4630-af31-ea25b72941e6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wb8d2\" (UID: \"97f344fc-42a3-4630-af31-ea25b72941e6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.982498 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3efa2991-05a6-465c-b8c1-105edce450d9-proxy-tls\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.982722 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k8mk5\" 
(UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.983462 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-profile-collector-cert\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.983778 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-oauth-config\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.984124 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a60ae8d9-5908-469c-aede-1af2fb6b8631-metrics-tls\") pod \"dns-operator-744455d44c-rs79q\" (UID: \"a60ae8d9-5908-469c-aede-1af2fb6b8631\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" Sep 30 14:00:37 crc kubenswrapper[4676]: I0930 14:00:37.994819 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbr68\" (UniqueName: \"kubernetes.io/projected/a60ae8d9-5908-469c-aede-1af2fb6b8631-kube-api-access-bbr68\") pod \"dns-operator-744455d44c-rs79q\" (UID: \"a60ae8d9-5908-469c-aede-1af2fb6b8631\") " pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.007653 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.027632 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcfxh\" (UniqueName: \"kubernetes.io/projected/3efa2991-05a6-465c-b8c1-105edce450d9-kube-api-access-hcfxh\") pod \"machine-config-operator-74547568cd-7p4nz\" (UID: \"3efa2991-05a6-465c-b8c1-105edce450d9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.053574 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dm7l\" (UniqueName: \"kubernetes.io/projected/6bf298e3-e793-4a56-9856-094437b77046-kube-api-access-2dm7l\") pod \"olm-operator-6b444d44fb-66mls\" (UID: \"6bf298e3-e793-4a56-9856-094437b77046\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.068655 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" event={"ID":"0b3b285f-9406-4f9b-9768-8827933418d7","Type":"ContainerStarted","Data":"768459b8d0ae42a92971c99a8a1ddd447fbe9f91d0ef8e4efa77308b1d771a68"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.069588 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjmdp\" (UniqueName: \"kubernetes.io/projected/f652cd09-f743-4500-bace-2652974c9ef3-kube-api-access-jjmdp\") pod \"openshift-config-operator-7777fb866f-mqdxw\" (UID: \"f652cd09-f743-4500-bace-2652974c9ef3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.071660 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" 
event={"ID":"be094421-14a0-4ba3-b708-fb08c498c2d6","Type":"ContainerStarted","Data":"75d7d0da0cb27f33a8846a8c4ed66afb3132387215bb8b03a787aa8e3e121284"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.071717 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" event={"ID":"be094421-14a0-4ba3-b708-fb08c498c2d6","Type":"ContainerStarted","Data":"d222486a4dee1977e0217197171ff86df6127e68bb0d70e5d977fcd048505ca4"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.073150 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" event={"ID":"5d26a632-0017-40e4-ac56-5794c4800d83","Type":"ContainerStarted","Data":"45e650c76a22d6c3497f79a28f15b860fcb19d063be06d55ed6c6f4f27664421"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.073499 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.073768 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.573739564 +0000 UTC m=+142.556828003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.073857 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-signing-cabundle\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.073911 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-csi-data-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.073938 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c466d-b781-4d90-a824-3061cf8890be-config\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074226 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-csi-data-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074254 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-socket-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074296 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eb8f48-640e-413a-8845-9b2e3bf86f23-config\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074322 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05eb8f48-640e-413a-8845-9b2e3bf86f23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074346 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-webhook-cert\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074383 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjm2d\" (UniqueName: 
\"kubernetes.io/projected/36b6965c-2288-4689-bb83-d099cd6e4a3d-kube-api-access-sjm2d\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074403 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c23c466d-b781-4d90-a824-3061cf8890be-serving-cert\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074429 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/515397b2-ff6d-491a-9c49-fb217236b19f-config-volume\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074478 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-registration-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074517 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrftm\" (UniqueName: \"kubernetes.io/projected/63ad8370-13f3-4bcc-80f4-8a8e6b657667-kube-api-access-zrftm\") pod \"migrator-59844c95c7-7ttmn\" (UID: \"63ad8370-13f3-4bcc-80f4-8a8e6b657667\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074572 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tpj79\" (UniqueName: \"kubernetes.io/projected/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-kube-api-access-tpj79\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074585 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-socket-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074602 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074629 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-mountpoint-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074651 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-plugins-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c23c466d-b781-4d90-a824-3061cf8890be-config\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.075342 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-mountpoint-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.075467 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-plugins-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.075715 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05eb8f48-640e-413a-8845-9b2e3bf86f23-config\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.075790 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-registration-dir\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.075794 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.5757811 +0000 UTC m=+142.558869589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.076479 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-signing-cabundle\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.077160 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/515397b2-ff6d-491a-9c49-fb217236b19f-config-volume\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.077677 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" event={"ID":"ebd6b987-d54a-4692-800a-8eadc5e8690c","Type":"ContainerStarted","Data":"2bae2e8ec4414d356ef5157b3cd850e900d2218a2d120a096fa4d0c74946b12e"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.078288 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-webhook-cert\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.074700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fnsg\" (UniqueName: \"kubernetes.io/projected/ad665bdf-4477-45c2-bcde-96a4042e2176-kube-api-access-7fnsg\") pod \"package-server-manager-789f6589d5-r28xn\" (UID: \"ad665bdf-4477-45c2-bcde-96a4042e2176\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079017 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/36b6965c-2288-4689-bb83-d099cd6e4a3d-node-bootstrap-token\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079053 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjgnk\" (UniqueName: \"kubernetes.io/projected/6709a903-5bfa-42c6-be52-df8efb1d106e-kube-api-access-tjgnk\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079090 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx5vn\" (UniqueName: \"kubernetes.io/projected/515397b2-ff6d-491a-9c49-fb217236b19f-kube-api-access-nx5vn\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079117 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-apiservice-cert\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079141 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v68kh\" (UniqueName: \"kubernetes.io/projected/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-kube-api-access-v68kh\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079238 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/515397b2-ff6d-491a-9c49-fb217236b19f-metrics-tls\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079274 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41db2279-05aa-4558-8f60-1761471ba62c-cert\") pod \"ingress-canary-sngkt\" (UID: \"41db2279-05aa-4558-8f60-1761471ba62c\") " pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079302 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsxks\" (UniqueName: \"kubernetes.io/projected/c23c466d-b781-4d90-a824-3061cf8890be-kube-api-access-tsxks\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 
14:00:38.079349 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-signing-key\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079376 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05eb8f48-640e-413a-8845-9b2e3bf86f23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079408 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6709a903-5bfa-42c6-be52-df8efb1d106e-config-volume\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079432 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xfd9\" (UniqueName: \"kubernetes.io/projected/41db2279-05aa-4558-8f60-1761471ba62c-kube-api-access-4xfd9\") pod \"ingress-canary-sngkt\" (UID: \"41db2279-05aa-4558-8f60-1761471ba62c\") " pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079459 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/36b6965c-2288-4689-bb83-d099cd6e4a3d-certs\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " 
pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079502 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6709a903-5bfa-42c6-be52-df8efb1d106e-secret-volume\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079526 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad665bdf-4477-45c2-bcde-96a4042e2176-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r28xn\" (UID: \"ad665bdf-4477-45c2-bcde-96a4042e2176\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079548 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-tmpfs\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079595 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48p6\" (UniqueName: \"kubernetes.io/projected/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-kube-api-access-j48p6\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079623 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079655 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22c2x\" (UniqueName: \"kubernetes.io/projected/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-kube-api-access-22c2x\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.079690 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.078966 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c23c466d-b781-4d90-a824-3061cf8890be-serving-cert\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.081375 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6709a903-5bfa-42c6-be52-df8efb1d106e-config-volume\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.081472 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-tmpfs\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.081925 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.083453 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-apiservice-cert\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.083842 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/36b6965c-2288-4689-bb83-d099cd6e4a3d-certs\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.086497 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:38 crc 
kubenswrapper[4676]: I0930 14:00:38.087346 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6709a903-5bfa-42c6-be52-df8efb1d106e-secret-volume\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.088316 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/36b6965c-2288-4689-bb83-d099cd6e4a3d-node-bootstrap-token\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.088625 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/515397b2-ff6d-491a-9c49-fb217236b19f-metrics-tls\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.088866 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad665bdf-4477-45c2-bcde-96a4042e2176-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r28xn\" (UID: \"ad665bdf-4477-45c2-bcde-96a4042e2176\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.089314 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05eb8f48-640e-413a-8845-9b2e3bf86f23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.089956 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41db2279-05aa-4558-8f60-1761471ba62c-cert\") pod \"ingress-canary-sngkt\" (UID: \"41db2279-05aa-4558-8f60-1761471ba62c\") " pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.090406 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lvcb\" (UniqueName: \"kubernetes.io/projected/627ad3bd-de42-40d1-ac5c-ab04b2b2ddca-kube-api-access-8lvcb\") pod \"catalog-operator-68c6474976-6mnvc\" (UID: \"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.093029 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.096457 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-signing-key\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.100439 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" event={"ID":"61b1921b-4102-4959-abd7-c86ca3ae880e","Type":"ContainerStarted","Data":"b97143f2266a0da32711e5f0615f458fab187e7156146ccd6101995077239953"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.100515 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" event={"ID":"61b1921b-4102-4959-abd7-c86ca3ae880e","Type":"ContainerStarted","Data":"4e025dbf33a38f3867667d8c52bbc6658048ca466e959b5560f4f5184bc50b80"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.100690 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.108820 4676 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-qbdpc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.109185 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" podUID="61b1921b-4102-4959-abd7-c86ca3ae880e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.110198 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wwnsl"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.111084 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8nz6\" (UniqueName: \"kubernetes.io/projected/216b0fb0-8e21-4b04-96c0-1f2c984b09e2-kube-api-access-d8nz6\") pod \"machine-config-controller-84d6567774-2sdrd\" (UID: \"216b0fb0-8e21-4b04-96c0-1f2c984b09e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.112258 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" event={"ID":"79abcd65-3889-4034-874b-a8c0ad78caac","Type":"ContainerStarted","Data":"d9f09f7e37dd26bbb17d8074a7461e93169126c47676600143b6d528069b9b90"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.112336 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" event={"ID":"79abcd65-3889-4034-874b-a8c0ad78caac","Type":"ContainerStarted","Data":"363c233ef7986a33e0f508eb1e76d4c1fa5622594f4f9ebda3c3aaa37fb5fda6"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.128560 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" event={"ID":"4c806cf0-18da-4035-bf9c-f134a1b23485","Type":"ContainerStarted","Data":"6ae3821bf55a85b5def945d0548e428c07e6779baa03172d009f39ebf5fe163b"} Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.129282 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c68x\" (UniqueName: \"kubernetes.io/projected/2ef453c0-cdb0-4da7-8c32-3b975e1009a1-kube-api-access-7c68x\") pod \"downloads-7954f5f757-qnmrm\" (UID: \"2ef453c0-cdb0-4da7-8c32-3b975e1009a1\") " pod="openshift-console/downloads-7954f5f757-qnmrm" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.154986 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82m47\" (UniqueName: \"kubernetes.io/projected/d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba-kube-api-access-82m47\") pod \"machine-approver-56656f9798-bthfv\" (UID: \"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.155469 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 
14:00:38.171706 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h82kj\" (UniqueName: \"kubernetes.io/projected/0c9f7b25-e209-4e15-b74f-ac572638fc9a-kube-api-access-h82kj\") pod \"kube-storage-version-migrator-operator-b67b599dd-s5h2z\" (UID: \"0c9f7b25-e209-4e15-b74f-ac572638fc9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.181114 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.183731 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.683694694 +0000 UTC m=+142.666783113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.184429 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.186120 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.68604857 +0000 UTC m=+142.669137199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.189280 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-bound-sa-token\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.194845 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.201525 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qnmrm" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.214620 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlmdb\" (UniqueName: \"kubernetes.io/projected/69f91148-b6d8-4a40-a4b4-f0411e9617ed-kube-api-access-xlmdb\") pod \"etcd-operator-b45778765-5hctc\" (UID: \"69f91148-b6d8-4a40-a4b4-f0411e9617ed\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.215831 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.247857 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.250937 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fndjg\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-kube-api-access-fndjg\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.252520 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.259035 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.265942 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4m5j\" (UniqueName: \"kubernetes.io/projected/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-kube-api-access-x4m5j\") pod \"console-f9d7485db-zgcbx\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.270646 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kv7h\" (UniqueName: \"kubernetes.io/projected/3c5e85dd-ceb0-40eb-86b8-353702e07379-kube-api-access-9kv7h\") pod \"oauth-openshift-558db77b4-k8mk5\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.286765 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.287607 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.787582316 +0000 UTC m=+142.770670745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.294505 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.300703 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.302450 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1f17b4e-4e0a-418e-9c13-33300291d209-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zgqt5\" (UID: \"f1f17b4e-4e0a-418e-9c13-33300291d209\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.312091 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5rrn6"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.320108 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvqh\" (UniqueName: \"kubernetes.io/projected/97f344fc-42a3-4630-af31-ea25b72941e6-kube-api-access-nvvqh\") pod \"control-plane-machine-set-operator-78cbb6b69f-wb8d2\" (UID: \"97f344fc-42a3-4630-af31-ea25b72941e6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.322644 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c806cf0_18da_4035_bf9c_f134a1b23485.slice/crio-783730c496ee48ab5e1f8272e1dccad64ad4c3b4505f2bd49c198ba96ef48e5a.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.332745 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9ls\" (UniqueName: \"kubernetes.io/projected/b7545eda-fbd7-4d47-8f48-084b1319bf34-kube-api-access-7t9ls\") pod \"router-default-5444994796-tj9mh\" (UID: \"b7545eda-fbd7-4d47-8f48-084b1319bf34\") " 
pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.374411 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eeefe4d3-1e4c-4149-9da0-9c3533991a83-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ggdd9\" (UID: \"eeefe4d3-1e4c-4149-9da0-9c3533991a83\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.374568 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2376906-3f0c-4ba2-b27b-ae1464676554-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" Sep 30 14:00:38 crc kubenswrapper[4676]: W0930 14:00:38.379066 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa148973_699f_4aeb_839b_ec65aa1a2a37.slice/crio-63aa99d89639dd232f52b832e7d3d44ff5cea1ffd8dafb0d72e75114b00e96c7 WatchSource:0}: Error finding container 63aa99d89639dd232f52b832e7d3d44ff5cea1ffd8dafb0d72e75114b00e96c7: Status 404 returned error can't find the container with id 63aa99d89639dd232f52b832e7d3d44ff5cea1ffd8dafb0d72e75114b00e96c7 Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.388490 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.389454 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.889434142 +0000 UTC m=+142.872522571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.407210 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjkk4\" (UniqueName: \"kubernetes.io/projected/f2376906-3f0c-4ba2-b27b-ae1464676554-kube-api-access-rjkk4\") pod \"ingress-operator-5b745b69d9-gcvn8\" (UID: \"f2376906-3f0c-4ba2-b27b-ae1464676554\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.412143 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56bpp\" (UniqueName: \"kubernetes.io/projected/a296de91-e1c3-4820-a2ec-cedfc4eac0db-kube-api-access-56bpp\") pod \"openshift-controller-manager-operator-756b6f6bc6-p9gxt\" (UID: \"a296de91-e1c3-4820-a2ec-cedfc4eac0db\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.450516 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpj79\" (UniqueName: \"kubernetes.io/projected/f8d50b6f-eaaa-4b6a-b447-a8614539c9f8-kube-api-access-tpj79\") pod \"service-ca-9c57cc56f-wfv7t\" (UID: \"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.459923 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.473743 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrftm\" (UniqueName: \"kubernetes.io/projected/63ad8370-13f3-4bcc-80f4-8a8e6b657667-kube-api-access-zrftm\") pod \"migrator-59844c95c7-7ttmn\" (UID: \"63ad8370-13f3-4bcc-80f4-8a8e6b657667\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.487753 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.490599 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.492321 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:38.992293275 +0000 UTC m=+142.975381704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.496683 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05eb8f48-640e-413a-8845-9b2e3bf86f23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gmxgf\" (UID: \"05eb8f48-640e-413a-8845-9b2e3bf86f23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.508080 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.509776 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qnmrm"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.514125 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjm2d\" (UniqueName: \"kubernetes.io/projected/36b6965c-2288-4689-bb83-d099cd6e4a3d-kube-api-access-sjm2d\") pod \"machine-config-server-h8fq9\" (UID: \"36b6965c-2288-4689-bb83-d099cd6e4a3d\") " pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.535781 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.539360 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.547649 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fnsg\" (UniqueName: \"kubernetes.io/projected/ad665bdf-4477-45c2-bcde-96a4042e2176-kube-api-access-7fnsg\") pod \"package-server-manager-789f6589d5-r28xn\" (UID: \"ad665bdf-4477-45c2-bcde-96a4042e2176\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.567816 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.572658 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsxks\" (UniqueName: \"kubernetes.io/projected/c23c466d-b781-4d90-a824-3061cf8890be-kube-api-access-tsxks\") pod \"service-ca-operator-777779d784-8dzwh\" (UID: \"c23c466d-b781-4d90-a824-3061cf8890be\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.573175 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.573874 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjgnk\" (UniqueName: \"kubernetes.io/projected/6709a903-5bfa-42c6-be52-df8efb1d106e-kube-api-access-tjgnk\") pod \"collect-profiles-29320680-q9p7h\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.580300 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.587093 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.593104 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.593625 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.093601965 +0000 UTC m=+143.076690394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.595580 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22c2x\" (UniqueName: \"kubernetes.io/projected/2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6-kube-api-access-22c2x\") pod \"csi-hostpathplugin-rnmq9\" (UID: \"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6\") " pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.614466 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.614981 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48p6\" (UniqueName: \"kubernetes.io/projected/9ae45d95-5ab3-4a8f-906a-6ed43bb21612-kube-api-access-j48p6\") pod \"packageserver-d55dfcdfc-52tgw\" (UID: \"9ae45d95-5ab3-4a8f-906a-6ed43bb21612\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.621115 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.627414 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.631567 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xfd9\" (UniqueName: \"kubernetes.io/projected/41db2279-05aa-4558-8f60-1761471ba62c-kube-api-access-4xfd9\") pod \"ingress-canary-sngkt\" (UID: \"41db2279-05aa-4558-8f60-1761471ba62c\") " pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.642060 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.650799 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.664411 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.667127 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68kh\" (UniqueName: \"kubernetes.io/projected/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-kube-api-access-v68kh\") pod \"marketplace-operator-79b997595-7wrkd\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.667569 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.677033 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.682649 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx5vn\" (UniqueName: \"kubernetes.io/projected/515397b2-ff6d-491a-9c49-fb217236b19f-kube-api-access-nx5vn\") pod \"dns-default-j465h\" (UID: \"515397b2-ff6d-491a-9c49-fb217236b19f\") " pod="openshift-dns/dns-default-j465h" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.695551 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.696126 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.696617 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.196587501 +0000 UTC m=+143.179675930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.705374 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-h8fq9" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.720063 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sngkt" Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.722969 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j465h" Sep 30 14:00:38 crc kubenswrapper[4676]: W0930 14:00:38.767783 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef453c0_cdb0_4da7_8c32_3b975e1009a1.slice/crio-a07842733d2ddd61ba75d79aeacef9d954ece9cfe86a79fc2ca4ed7d192870ac WatchSource:0}: Error finding container a07842733d2ddd61ba75d79aeacef9d954ece9cfe86a79fc2ca4ed7d192870ac: Status 404 returned error can't find the container with id a07842733d2ddd61ba75d79aeacef9d954ece9cfe86a79fc2ca4ed7d192870ac Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.798619 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.799131 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.299114706 +0000 UTC m=+143.282203135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.856698 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.867023 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.870330 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rs79q"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.900138 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:38 crc kubenswrapper[4676]: E0930 14:00:38.900958 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.400908369 +0000 UTC m=+143.383996808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.943993 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z"] Sep 30 14:00:38 crc kubenswrapper[4676]: I0930 14:00:38.946065 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.002628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.003107 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.503092974 +0000 UTC m=+143.486181403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.062930 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.105818 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.106268 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.606247496 +0000 UTC m=+143.589335925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: W0930 14:00:39.130144 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf652cd09_f743_4500_bace_2652974c9ef3.slice/crio-aff1c3c405a989b6677334f7b9b4ec0a3511833645eb4b4e5e1dc66ff5549cb8 WatchSource:0}: Error finding container aff1c3c405a989b6677334f7b9b4ec0a3511833645eb4b4e5e1dc66ff5549cb8: Status 404 returned error can't find the container with id aff1c3c405a989b6677334f7b9b4ec0a3511833645eb4b4e5e1dc66ff5549cb8 Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.150710 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.178002 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" event={"ID":"a60ae8d9-5908-469c-aede-1af2fb6b8631","Type":"ContainerStarted","Data":"24be3c881006a886dc1b8f88888504d01e5312bc582332cd898e5364d58a7045"} Sep 30 14:00:39 crc kubenswrapper[4676]: W0930 14:00:39.188070 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod216b0fb0_8e21_4b04_96c0_1f2c984b09e2.slice/crio-ad28f438d67bce47cb33b3a3db3efc119ce5c101aef4c42419921ab00bb907c7 WatchSource:0}: Error finding container ad28f438d67bce47cb33b3a3db3efc119ce5c101aef4c42419921ab00bb907c7: Status 404 returned error can't find the container with id 
ad28f438d67bce47cb33b3a3db3efc119ce5c101aef4c42419921ab00bb907c7 Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.192966 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" event={"ID":"6bf298e3-e793-4a56-9856-094437b77046","Type":"ContainerStarted","Data":"6faeaddeada9dc85fb7a337475ae6835b83a6bff6830c078da30cbec42132884"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.194970 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qnmrm" event={"ID":"2ef453c0-cdb0-4da7-8c32-3b975e1009a1","Type":"ContainerStarted","Data":"a07842733d2ddd61ba75d79aeacef9d954ece9cfe86a79fc2ca4ed7d192870ac"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.197558 4676 generic.go:334] "Generic (PLEG): container finished" podID="4c806cf0-18da-4035-bf9c-f134a1b23485" containerID="783730c496ee48ab5e1f8272e1dccad64ad4c3b4505f2bd49c198ba96ef48e5a" exitCode=0 Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.197704 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" event={"ID":"4c806cf0-18da-4035-bf9c-f134a1b23485","Type":"ContainerDied","Data":"783730c496ee48ab5e1f8272e1dccad64ad4c3b4505f2bd49c198ba96ef48e5a"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.202525 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" event={"ID":"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba","Type":"ContainerStarted","Data":"830458d6c35ce88bafa166858d773c96e917806e2d4711f22c0ce4b22e789a1a"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.202583 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" 
event={"ID":"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba","Type":"ContainerStarted","Data":"4edce11ae23abaeb1f83bab898634a9d8471f460b1e1f3001615a9beb367b7c8"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.208366 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" event={"ID":"fa148973-699f-4aeb-839b-ec65aa1a2a37","Type":"ContainerStarted","Data":"63aa99d89639dd232f52b832e7d3d44ff5cea1ffd8dafb0d72e75114b00e96c7"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.209344 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" event={"ID":"f652cd09-f743-4500-bace-2652974c9ef3","Type":"ContainerStarted","Data":"aff1c3c405a989b6677334f7b9b4ec0a3511833645eb4b4e5e1dc66ff5549cb8"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.209465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.210499 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.710476557 +0000 UTC m=+143.693564996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.211242 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" event={"ID":"ebd6b987-d54a-4692-800a-8eadc5e8690c","Type":"ContainerStarted","Data":"07b52ba0aa25a65a18e15b7295aeb714a15ec13d3ca735002295abee5c06da78"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.211267 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" event={"ID":"ebd6b987-d54a-4692-800a-8eadc5e8690c","Type":"ContainerStarted","Data":"ec7249d6a6578227e99c7439ece4156ca794ef196d7f3dc8e2a507c48b747146"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.215810 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wwnsl" event={"ID":"3fc15677-b9d9-4e15-ac99-4072e3ad4c91","Type":"ContainerStarted","Data":"4689f4f39a125ff3b8ae2dde17069201a53f16c92c151c53a0a6fb050d437ddb"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.215861 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wwnsl" event={"ID":"3fc15677-b9d9-4e15-ac99-4072e3ad4c91","Type":"ContainerStarted","Data":"1e16fc6fb984ab9705e77b7dfb872e4a4a4b4d49498db8f7c2e55b79c878f00e"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.216583 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-wwnsl" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.222999 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" event={"ID":"c931aa3f-02a2-4dd7-bd02-a6a8e4484875","Type":"ContainerStarted","Data":"ed3ed6fcb50de66959cf3f01dca17684da8dbb008a7b7feec789f9812b5cdf6c"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.223080 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" event={"ID":"c931aa3f-02a2-4dd7-bd02-a6a8e4484875","Type":"ContainerStarted","Data":"6104d18bd31e4a76f4c56eb25514b6fa47a3782c83313d92dc44801e871326a8"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.232718 4676 patch_prober.go:28] interesting pod/console-operator-58897d9998-wwnsl container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.232803 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wwnsl" podUID="3fc15677-b9d9-4e15-ac99-4072e3ad4c91" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.265112 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" event={"ID":"10f8e308-867e-4dd1-be59-c73d8297cbfe","Type":"ContainerStarted","Data":"225f820b19a0270969158fe3b5f4bf694eecade2fe20dc07bd5e06b64b15132d"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.267735 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" event={"ID":"10f8e308-867e-4dd1-be59-c73d8297cbfe","Type":"ContainerStarted","Data":"14f484b3d8c7481d316f302ea32b6ea16d32f15194112e9f4e7a7fae265f2f81"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.267756 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" event={"ID":"10f8e308-867e-4dd1-be59-c73d8297cbfe","Type":"ContainerStarted","Data":"b5553a70b3cee7efd38d384b98bb342cf2392e45e579e8b7dd4e53e633cb4f6a"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.269255 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.269678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" event={"ID":"0b3b285f-9406-4f9b-9768-8827933418d7","Type":"ContainerStarted","Data":"bb9f3c494c7aa1a871ede7d9a4d19bc790048d46eefbc6f4f7a6fc1e15ce6ddf"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.270070 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.271645 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5hctc"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.281120 4676 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gn7dj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.281182 4676 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" podUID="0b3b285f-9406-4f9b-9768-8827933418d7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.283971 4676 generic.go:334] "Generic (PLEG): container finished" podID="5d26a632-0017-40e4-ac56-5794c4800d83" containerID="c259a6bb33937196b9d5e0dd0aea7a466f8f74f673b57ebbd55ae70ced0eeb48" exitCode=0 Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.284156 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" event={"ID":"5d26a632-0017-40e4-ac56-5794c4800d83","Type":"ContainerDied","Data":"c259a6bb33937196b9d5e0dd0aea7a466f8f74f673b57ebbd55ae70ced0eeb48"} Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.284916 4676 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-qbdpc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.284977 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" podUID="61b1921b-4102-4959-abd7-c86ca3ae880e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.299916 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zgcbx"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.311417 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.312636 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.812584789 +0000 UTC m=+143.795673218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.378783 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" podStartSLOduration=117.378760522 podStartE2EDuration="1m57.378760522s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:39.376394156 +0000 UTC m=+143.359482595" watchObservedRunningTime="2025-09-30 14:00:39.378760522 +0000 UTC m=+143.361848951" Sep 30 14:00:39 crc kubenswrapper[4676]: W0930 14:00:39.413759 4676 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b6965c_2288_4689_bb83_d099cd6e4a3d.slice/crio-d9efa3d5a3e21c1da53c19d3d1500533af41c948555f22b9faf654f693b2da99 WatchSource:0}: Error finding container d9efa3d5a3e21c1da53c19d3d1500533af41c948555f22b9faf654f693b2da99: Status 404 returned error can't find the container with id d9efa3d5a3e21c1da53c19d3d1500533af41c948555f22b9faf654f693b2da99 Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.417689 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.419232 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5"] Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.419922 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:39.919904657 +0000 UTC m=+143.902993086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.493415 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.521481 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-6fq4x" podStartSLOduration=118.521453784 podStartE2EDuration="1m58.521453784s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:39.468419938 +0000 UTC m=+143.451508367" watchObservedRunningTime="2025-09-30 14:00:39.521453784 +0000 UTC m=+143.504542213" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.525478 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.526028 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 14:00:40.026011401 +0000 UTC m=+144.009099830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.533816 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.584525 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.605714 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.621303 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8mk5"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.631230 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.631689 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.131675072 +0000 UTC m=+144.114763501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.734511 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.734978 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.234953877 +0000 UTC m=+144.218042306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.834538 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf"] Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.842628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.843118 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.343099227 +0000 UTC m=+144.326187656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.891776 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wwnsl" podStartSLOduration=118.891758832 podStartE2EDuration="1m58.891758832s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:39.889697075 +0000 UTC m=+143.872785514" watchObservedRunningTime="2025-09-30 14:00:39.891758832 +0000 UTC m=+143.874847261" Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.947580 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.947800 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.447761801 +0000 UTC m=+144.430850230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:39 crc kubenswrapper[4676]: I0930 14:00:39.948106 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:39 crc kubenswrapper[4676]: E0930 14:00:39.948509 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.448502142 +0000 UTC m=+144.431590561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.054527 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.054963 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.554937325 +0000 UTC m=+144.538025754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.073141 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h9nl2" podStartSLOduration=119.073115591 podStartE2EDuration="1m59.073115591s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:40.025041962 +0000 UTC m=+144.008130401" watchObservedRunningTime="2025-09-30 14:00:40.073115591 +0000 UTC m=+144.056204020" Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.134690 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qhnkq" podStartSLOduration=119.134668174 podStartE2EDuration="1m59.134668174s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:40.131095925 +0000 UTC m=+144.114184354" watchObservedRunningTime="2025-09-30 14:00:40.134668174 +0000 UTC m=+144.117756613" Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.169918 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.170429 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.670416409 +0000 UTC m=+144.653504828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.256523 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-c9kjj" podStartSLOduration=118.256504936 podStartE2EDuration="1m58.256504936s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:40.228353661 +0000 UTC m=+144.211442090" watchObservedRunningTime="2025-09-30 14:00:40.256504936 +0000 UTC m=+144.239593365" Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.257748 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wfv7t"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.272554 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.273111 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.773087497 +0000 UTC m=+144.756175926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.335754 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" event={"ID":"216b0fb0-8e21-4b04-96c0-1f2c984b09e2","Type":"ContainerStarted","Data":"ad28f438d67bce47cb33b3a3db3efc119ce5c101aef4c42419921ab00bb907c7"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.374003 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.374380 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:40.874360967 +0000 UTC m=+144.857449436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.379798 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" event={"ID":"fa148973-699f-4aeb-839b-ec65aa1a2a37","Type":"ContainerStarted","Data":"bcdc7fc7bf05c2253934eb3b8d9132acffe413fdc2e8289ab7112e4a03f2cffa"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.418181 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.441468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" event={"ID":"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca","Type":"ContainerStarted","Data":"e93ab197d7d4514fee618d9d9c07d896c2dc00123f90c4da18044186b63477ef"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.499033 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.502173 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.505459 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.005437525 +0000 UTC m=+144.988525954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.506944 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.507748 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.007739479 +0000 UTC m=+144.990827908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.544158 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wrkd"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.556556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-h8fq9" event={"ID":"36b6965c-2288-4689-bb83-d099cd6e4a3d","Type":"ContainerStarted","Data":"d9efa3d5a3e21c1da53c19d3d1500533af41c948555f22b9faf654f693b2da99"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.569183 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7sxgh" podStartSLOduration=119.569167919 podStartE2EDuration="1m59.569167919s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:40.535046049 +0000 UTC m=+144.518134488" watchObservedRunningTime="2025-09-30 14:00:40.569167919 +0000 UTC m=+144.552256348" Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.569355 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.569382 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sngkt"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 
14:00:40.574373 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" podStartSLOduration=119.574356294 podStartE2EDuration="1m59.574356294s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:40.569118458 +0000 UTC m=+144.552206887" watchObservedRunningTime="2025-09-30 14:00:40.574356294 +0000 UTC m=+144.557444743" Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.580032 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5" event={"ID":"f1f17b4e-4e0a-418e-9c13-33300291d209","Type":"ContainerStarted","Data":"5e6861e0056332e026c211b3ab866df4511c20c996af413cc16f97f40ccee26e"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.589350 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tj9mh" event={"ID":"b7545eda-fbd7-4d47-8f48-084b1319bf34","Type":"ContainerStarted","Data":"c48ccf16dd98b4f0d52b4fd097ef3f022a6044aef7e1a35dd2a62e3ab510a74a"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.603382 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9" event={"ID":"eeefe4d3-1e4c-4149-9da0-9c3533991a83","Type":"ContainerStarted","Data":"841040ea63a6e893704c94b7a8e5f6d5777b044c0161debdebf7b3fb98b61a31"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.612433 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 
14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.613231 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.113202155 +0000 UTC m=+145.096290584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.644194 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j465h"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.672064 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh"] Sep 30 14:00:40 crc kubenswrapper[4676]: W0930 14:00:40.673297 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f344fc_42a3_4630_af31_ea25b72941e6.slice/crio-5f101c80931531bdf52d363fa22ca58b18bda7dc9b0df6b343eb8feca7057946 WatchSource:0}: Error finding container 5f101c80931531bdf52d363fa22ca58b18bda7dc9b0df6b343eb8feca7057946: Status 404 returned error can't find the container with id 5f101c80931531bdf52d363fa22ca58b18bda7dc9b0df6b343eb8feca7057946 Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.674229 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" 
event={"ID":"0c9f7b25-e209-4e15-b74f-ac572638fc9a","Type":"ContainerStarted","Data":"2a332835ff1797cf59bc78a07190e0e93538d137c652a2a84edc7497c566bf39"} Sep 30 14:00:40 crc kubenswrapper[4676]: W0930 14:00:40.688439 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ae45d95_5ab3_4a8f_906a_6ed43bb21612.slice/crio-34c0d89f7d39ebf10b3c082e0e51dea76a5dda43ac97bfa1b36b0ecc2fefd1d8 WatchSource:0}: Error finding container 34c0d89f7d39ebf10b3c082e0e51dea76a5dda43ac97bfa1b36b0ecc2fefd1d8: Status 404 returned error can't find the container with id 34c0d89f7d39ebf10b3c082e0e51dea76a5dda43ac97bfa1b36b0ecc2fefd1d8 Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.698705 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rnmq9"] Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.715630 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.728208 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.228188605 +0000 UTC m=+145.211277034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.731098 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" event={"ID":"3efa2991-05a6-465c-b8c1-105edce450d9","Type":"ContainerStarted","Data":"a5cf2394875a2b60bfc0ed3928ef47000b35cb3446a4c8e7106fb317a7f236c9"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.769651 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" event={"ID":"f2376906-3f0c-4ba2-b27b-ae1464676554","Type":"ContainerStarted","Data":"01020ecfa11d277caaa9b3c813c4f868ce33f6f1795e87126ada83fbb0bfa3f8"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.770987 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tj9mh" podStartSLOduration=119.770974497 podStartE2EDuration="1m59.770974497s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:40.770261447 +0000 UTC m=+144.753349876" watchObservedRunningTime="2025-09-30 14:00:40.770974497 +0000 UTC m=+144.754062926" Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.805399 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zgcbx" 
event={"ID":"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d","Type":"ContainerStarted","Data":"a7d1e40a274a98d8dff2e0d85a1bc4ade490d1fdac6303a396af1869601339d6"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.808000 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" event={"ID":"6709a903-5bfa-42c6-be52-df8efb1d106e","Type":"ContainerStarted","Data":"56cb7f2a7b3c3d7f359899746aba8e0b5796f31ddbcd4873b4e1942e0c4e5951"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.812517 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" event={"ID":"69f91148-b6d8-4a40-a4b4-f0411e9617ed","Type":"ContainerStarted","Data":"3126bd9243eee0618158b9a7264d167faa92d39051a5c3aa77254b85d0a97c88"} Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.821236 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.829484 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.830956 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.330937906 +0000 UTC m=+145.314026345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:40 crc kubenswrapper[4676]: I0930 14:00:40.930950 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:40 crc kubenswrapper[4676]: E0930 14:00:40.931325 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.43131233 +0000 UTC m=+145.414400759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.032911 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.033501 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.533481454 +0000 UTC m=+145.516569883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.134777 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.135250 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.635233487 +0000 UTC m=+145.618321926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.235772 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.237102 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.737072052 +0000 UTC m=+145.720160481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.337785 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.338128 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.838117134 +0000 UTC m=+145.821205553 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.438505 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.438840 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:41.938825018 +0000 UTC m=+145.921913447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.476102 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wwnsl"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.539785 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.540227 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.04021106 +0000 UTC m=+146.023299499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.591338 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tj9mh"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.603120 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.603179 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.644100 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.644845 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.144820762 +0000 UTC m=+146.127909181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.746219 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.746556 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.246544074 +0000 UTC m=+146.229632503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.845048 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" event={"ID":"63ad8370-13f3-4bcc-80f4-8a8e6b657667","Type":"ContainerStarted","Data":"78edea8ccc60951e7c8960b978332a7512de2ea71b18fd7f8a212dde46cb2366"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.847834 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.848122 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.34809339 +0000 UTC m=+146.331181819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.848566 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.849158 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.349124289 +0000 UTC m=+146.332212718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.856906 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" event={"ID":"f2376906-3f0c-4ba2-b27b-ae1464676554","Type":"ContainerStarted","Data":"fc8ce160a96911fee827a0da206d6d0857c453b290465cd33c2a7faeea285aaa"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.863802 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" event={"ID":"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8","Type":"ContainerStarted","Data":"b6913eac81434be16a4a78583ed0eb33603395eef424b64c4ad0fa9d72d128b8"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.863857 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" event={"ID":"f8d50b6f-eaaa-4b6a-b447-a8614539c9f8","Type":"ContainerStarted","Data":"b880cdc3a7ac62f2a2c569e57b09912926c4e952c0580e84a92ebe44b203a2ac"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.867022 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" event={"ID":"4c806cf0-18da-4035-bf9c-f134a1b23485","Type":"ContainerStarted","Data":"8fc0467a104767ced8c5d073f8bdcb2e7d98207e956b547a2bcaa9f033193839"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.869208 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" event={"ID":"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6","Type":"ContainerStarted","Data":"5ebdcec91172538a256a35ead8dce1b70a717c6b28a2897488d9876742a26b94"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.872311 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" event={"ID":"0c9f7b25-e209-4e15-b74f-ac572638fc9a","Type":"ContainerStarted","Data":"0504d6a2b710f246e7ea9112a9011a43e2e6bae88cac8fa38e251a37cdbcd48b"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.875806 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" event={"ID":"69f91148-b6d8-4a40-a4b4-f0411e9617ed","Type":"ContainerStarted","Data":"4b90928adc0388e49816a83d0a9d630b46766ffeef97fcc04e7a9479ea5ecfb5"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.885088 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" event={"ID":"3efa2991-05a6-465c-b8c1-105edce450d9","Type":"ContainerStarted","Data":"3532e40365119c2896be646946b70c902285c70083001e4d4e5d0e6a6a686966"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.885159 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" event={"ID":"3efa2991-05a6-465c-b8c1-105edce450d9","Type":"ContainerStarted","Data":"f224ae705bb7ddc11877203d3522b7a3e81ada3a3e469bacb24113b8edc6e88d"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.887242 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" event={"ID":"627ad3bd-de42-40d1-ac5c-ab04b2b2ddca","Type":"ContainerStarted","Data":"31b3e2d37acc1affa7deaf62ddea7354f98fd02749c004bc605f3b2ec32d4e11"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.888724 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.893301 4676 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-6mnvc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body=
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.893358 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" podUID="627ad3bd-de42-40d1-ac5c-ab04b2b2ddca" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.931032 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" podStartSLOduration=119.931014239 podStartE2EDuration="1m59.931014239s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:41.928852939 +0000 UTC m=+145.911941368" watchObservedRunningTime="2025-09-30 14:00:41.931014239 +0000 UTC m=+145.914102668"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.941036 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" event={"ID":"6bf298e3-e793-4a56-9856-094437b77046","Type":"ContainerStarted","Data":"570106ec1767b53cb6f32ec46c17682414e5939ac0681248f56b65d4be8ae18e"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.941970 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.951466 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.951671 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.451647103 +0000 UTC m=+146.434735522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.951971 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:41 crc kubenswrapper[4676]: E0930 14:00:41.953467 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.453459434 +0000 UTC m=+146.436547863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.954832 4676 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-66mls container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.955234 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" podUID="6bf298e3-e793-4a56-9856-094437b77046" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.960698 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p4nz" podStartSLOduration=119.960680075 podStartE2EDuration="1m59.960680075s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:41.960280843 +0000 UTC m=+145.943369282" watchObservedRunningTime="2025-09-30 14:00:41.960680075 +0000 UTC m=+145.943768504"
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.961528 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tj9mh" event={"ID":"b7545eda-fbd7-4d47-8f48-084b1319bf34","Type":"ContainerStarted","Data":"498175ac95111a1b842989303e84825867a6e942a685f1649b5c6fc0b65c0972"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.970599 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" event={"ID":"ad665bdf-4477-45c2-bcde-96a4042e2176","Type":"ContainerStarted","Data":"fc5a652e7f8a70f5942a256fcf16a2fd9ee98bb90291d74d7fc0bc69e4068dda"}
Sep 30 14:00:41 crc kubenswrapper[4676]: I0930 14:00:41.996415 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" podStartSLOduration=119.996398659 podStartE2EDuration="1m59.996398659s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:41.995141154 +0000 UTC m=+145.978229583" watchObservedRunningTime="2025-09-30 14:00:41.996398659 +0000 UTC m=+145.979487088"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.000101 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" event={"ID":"97f344fc-42a3-4630-af31-ea25b72941e6","Type":"ContainerStarted","Data":"5f101c80931531bdf52d363fa22ca58b18bda7dc9b0df6b343eb8feca7057946"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.022086 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9" event={"ID":"eeefe4d3-1e4c-4149-9da0-9c3533991a83","Type":"ContainerStarted","Data":"b96a086daef8fb1f3f43a65e8861db60a91f4ab067190dac0405c977f1e0844f"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.028013 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" event={"ID":"216b0fb0-8e21-4b04-96c0-1f2c984b09e2","Type":"ContainerStarted","Data":"0a3eded0b28ff4644ac61fcbf5d1e2c08f62f7be7320a4e62b0ff087c20b6998"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.036640 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" event={"ID":"d39e9d64-8aa0-4b36-8cf6-9a7ebe5499ba","Type":"ContainerStarted","Data":"76e7711a0a318e4f8d0c8a2ca4bfa4c362ced78e8e08973ec806253a64261def"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.042036 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" event={"ID":"5d26a632-0017-40e4-ac56-5794c4800d83","Type":"ContainerStarted","Data":"6e15ec473bf6eb192133d9503e7a5cbe58dfdcb37b9a0d5e9fdbcd449f062c8b"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.045604 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s5h2z" podStartSLOduration=120.045560917 podStartE2EDuration="2m0.045560917s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.044039505 +0000 UTC m=+146.027127944" watchObservedRunningTime="2025-09-30 14:00:42.045560917 +0000 UTC m=+146.028649356"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.046513 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" event={"ID":"fa148973-699f-4aeb-839b-ec65aa1a2a37","Type":"ContainerStarted","Data":"3ea44a1f99a453a3d153b46af495aca5290c954dee535faf9e3a9809eec326e5"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.048068 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5hctc" podStartSLOduration=121.048048627 podStartE2EDuration="2m1.048048627s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.024785459 +0000 UTC m=+146.007873888" watchObservedRunningTime="2025-09-30 14:00:42.048048627 +0000 UTC m=+146.031137056"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.048848 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-h8fq9" event={"ID":"36b6965c-2288-4689-bb83-d099cd6e4a3d","Type":"ContainerStarted","Data":"0c91dd22acdce93892731b79ef4f9a979ac7cd16c72d7cd5bf5a7642f16f407d"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.050607 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5" event={"ID":"f1f17b4e-4e0a-418e-9c13-33300291d209","Type":"ContainerStarted","Data":"7575cfb52f9ab787d35d9bc0fd17dd13dfb6e82189dac142edc319d394212191"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.052045 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" event={"ID":"4ec8c2c0-6649-4fce-bd49-c178b25d9da1","Type":"ContainerStarted","Data":"bbd3ed4b1da8ec0fd64440ede714280bced06fa4169a8b6684fa9cfb0afe8791"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.054360 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.055548 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.555531285 +0000 UTC m=+146.538619714 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.058785 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" event={"ID":"6709a903-5bfa-42c6-be52-df8efb1d106e","Type":"ContainerStarted","Data":"519d01dc7fce15dba99590d1f77048e20d3892fdab8716958f9af3893e8ad4b7"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.063397 4676 generic.go:334] "Generic (PLEG): container finished" podID="f652cd09-f743-4500-bace-2652974c9ef3" containerID="5dfb0332767e750210bda3795fcba6b8477021cd4020aa3a55654ac5b8e361e8" exitCode=0
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.063486 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" event={"ID":"f652cd09-f743-4500-bace-2652974c9ef3","Type":"ContainerDied","Data":"5dfb0332767e750210bda3795fcba6b8477021cd4020aa3a55654ac5b8e361e8"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.067198 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" event={"ID":"c23c466d-b781-4d90-a824-3061cf8890be","Type":"ContainerStarted","Data":"94c530ac818e3187182b8f488101811700b041b9b5dbc11ac1876e81fb491fc5"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.072696 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bthfv" podStartSLOduration=121.072678253 podStartE2EDuration="2m1.072678253s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.071531581 +0000 UTC m=+146.054620020" watchObservedRunningTime="2025-09-30 14:00:42.072678253 +0000 UTC m=+146.055766682"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.077207 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sngkt" event={"ID":"41db2279-05aa-4558-8f60-1761471ba62c","Type":"ContainerStarted","Data":"d4a743ca1af1adca109af1f71daa6866e15aa3ea1006146e507a8ffd43a2ddab"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.077252 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sngkt" event={"ID":"41db2279-05aa-4558-8f60-1761471ba62c","Type":"ContainerStarted","Data":"1722e8ef915ffd68e7d69f77663e21fff5b6d258d89b7d1b3ca821cd5fb6bff9"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.098195 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j465h" event={"ID":"515397b2-ff6d-491a-9c49-fb217236b19f","Type":"ContainerStarted","Data":"4d01036e628f599be04a2a85f1939ccbc63d193d4c3770b9f8608a6d9ffd0140"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.105178 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" event={"ID":"a60ae8d9-5908-469c-aede-1af2fb6b8631","Type":"ContainerStarted","Data":"9b3d38e357ac667bfc693461c0c0d4134ffbe40bbd93e730218d216b84419149"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.115043 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ggdd9" podStartSLOduration=121.115023771 podStartE2EDuration="2m1.115023771s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.098004228 +0000 UTC m=+146.081092657" watchObservedRunningTime="2025-09-30 14:00:42.115023771 +0000 UTC m=+146.098112200"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.115398 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" podStartSLOduration=120.115391991 podStartE2EDuration="2m0.115391991s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.114806086 +0000 UTC m=+146.097894525" watchObservedRunningTime="2025-09-30 14:00:42.115391991 +0000 UTC m=+146.098480420"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.116031 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" event={"ID":"9ae45d95-5ab3-4a8f-906a-6ed43bb21612","Type":"ContainerStarted","Data":"6ee9ac439839f4505cfc90d46466eeb72e5b9562d712021bfca3b3fcf021bb0b"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.116074 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" event={"ID":"9ae45d95-5ab3-4a8f-906a-6ed43bb21612","Type":"ContainerStarted","Data":"34c0d89f7d39ebf10b3c082e0e51dea76a5dda43ac97bfa1b36b0ecc2fefd1d8"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.117601 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.123466 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zgcbx" event={"ID":"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d","Type":"ContainerStarted","Data":"2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.124134 4676 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-52tgw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body=
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.124180 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" podUID="9ae45d95-5ab3-4a8f-906a-6ed43bb21612" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.136090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" event={"ID":"a296de91-e1c3-4820-a2ec-cedfc4eac0db","Type":"ContainerStarted","Data":"1b596203ac7b97e4dc966622352b32fa7f4723ff91571b667ebd81f70a875dc0"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.136140 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" event={"ID":"a296de91-e1c3-4820-a2ec-cedfc4eac0db","Type":"ContainerStarted","Data":"5e2410227d1c9c1c3475fc91e7fbfee139d9432cde057e29db9dff573857ffdf"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.142517 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" podStartSLOduration=42.142498617 podStartE2EDuration="42.142498617s" podCreationTimestamp="2025-09-30 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.136465508 +0000 UTC m=+146.119553947" watchObservedRunningTime="2025-09-30 14:00:42.142498617 +0000 UTC m=+146.125587046"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.144944 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" event={"ID":"05eb8f48-640e-413a-8845-9b2e3bf86f23","Type":"ContainerStarted","Data":"9964cfeb69a50e85a59694306fb1845e57a749adaae998f697097f42eaad71a3"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.152828 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qnmrm" event={"ID":"2ef453c0-cdb0-4da7-8c32-3b975e1009a1","Type":"ContainerStarted","Data":"60d48710d1a7d68ce89361029700c61a1e305a3aa98e65abbee77dcb339d30a3"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.153674 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qnmrm"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.155005 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qnmrm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.155184 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" event={"ID":"3c5e85dd-ceb0-40eb-86b8-353702e07379","Type":"ContainerStarted","Data":"265d5ec9c55f6c2f5e7f65806973a49fdaf6b19740357cccadb896a1717162f9"}
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.155414 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qnmrm" podUID="2ef453c0-cdb0-4da7-8c32-3b975e1009a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.157031 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57"
Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.161772 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.661759042 +0000 UTC m=+146.644847461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.167828 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zgqt5" podStartSLOduration=121.167806861 podStartE2EDuration="2m1.167806861s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.16527071 +0000 UTC m=+146.148359139" watchObservedRunningTime="2025-09-30 14:00:42.167806861 +0000 UTC m=+146.150895290"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.186689 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sngkt" podStartSLOduration=7.186641615 podStartE2EDuration="7.186641615s" podCreationTimestamp="2025-09-30 14:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.185254956 +0000 UTC m=+146.168343385" watchObservedRunningTime="2025-09-30 14:00:42.186641615 +0000 UTC m=+146.169730044"
Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.258130 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID:
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.259384 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.759354789 +0000 UTC m=+146.742443218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.279270 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-h8fq9" podStartSLOduration=7.279248533 podStartE2EDuration="7.279248533s" podCreationTimestamp="2025-09-30 14:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.240557826 +0000 UTC m=+146.223646255" watchObservedRunningTime="2025-09-30 14:00:42.279248533 +0000 UTC m=+146.262336962" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.312311 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5rrn6" podStartSLOduration=120.312286663 podStartE2EDuration="2m0.312286663s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.280369034 +0000 UTC m=+146.263457463" 
watchObservedRunningTime="2025-09-30 14:00:42.312286663 +0000 UTC m=+146.295375092" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.344238 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-p9gxt" podStartSLOduration=121.344218722 podStartE2EDuration="2m1.344218722s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.317292362 +0000 UTC m=+146.300380801" watchObservedRunningTime="2025-09-30 14:00:42.344218722 +0000 UTC m=+146.327307151" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.360046 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.360446 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.860431713 +0000 UTC m=+146.843520152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.367907 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zgcbx" podStartSLOduration=121.367871099 podStartE2EDuration="2m1.367871099s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.344656164 +0000 UTC m=+146.327744593" watchObservedRunningTime="2025-09-30 14:00:42.367871099 +0000 UTC m=+146.350959528" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.372104 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qnmrm" podStartSLOduration=121.372085797 podStartE2EDuration="2m1.372085797s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.365442123 +0000 UTC m=+146.348530572" watchObservedRunningTime="2025-09-30 14:00:42.372085797 +0000 UTC m=+146.355174226" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.399448 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" podStartSLOduration=120.399428098 podStartE2EDuration="2m0.399428098s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:42.398816392 +0000 UTC m=+146.381904831" watchObservedRunningTime="2025-09-30 14:00:42.399428098 +0000 UTC m=+146.382516527" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.461927 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.462073 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.962053572 +0000 UTC m=+146.945142001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.462392 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.462707 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:42.962699029 +0000 UTC m=+146.945787458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.464525 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.464609 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.466416 4676 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-4mgv7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.466467 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" podUID="4c806cf0-18da-4035-bf9c-f134a1b23485" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.18:8443/livez\": dial tcp 10.217.0.18:8443: connect: connection refused" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.563478 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:42 crc 
kubenswrapper[4676]: E0930 14:00:42.563925 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.063908687 +0000 UTC m=+147.046997116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.597895 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:42 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:42 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:42 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.597952 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.665088 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" 
(UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.665382 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.165369392 +0000 UTC m=+147.148457821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.766246 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.766724 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.266704182 +0000 UTC m=+147.249792621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.868463 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.868940 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.368922768 +0000 UTC m=+147.352011207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:42 crc kubenswrapper[4676]: I0930 14:00:42.969868 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:42 crc kubenswrapper[4676]: E0930 14:00:42.970387 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.470349801 +0000 UTC m=+147.453438230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.072436 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.073045 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.573015469 +0000 UTC m=+147.556104078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.173589 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.173687 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.673664501 +0000 UTC m=+147.656752940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.173951 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.174274 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.674261127 +0000 UTC m=+147.657349556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.175019 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" event={"ID":"ad665bdf-4477-45c2-bcde-96a4042e2176","Type":"ContainerStarted","Data":"672418458df7a5c4ce592d41f7ebaa5b0c6032e4ca6149ff1ffb1eb5956b137a"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.183469 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" event={"ID":"a60ae8d9-5908-469c-aede-1af2fb6b8631","Type":"ContainerStarted","Data":"66c457cb4f19c3723ec6d4546645222849631cf2ce925f13a775177612664fff"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.191244 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" event={"ID":"216b0fb0-8e21-4b04-96c0-1f2c984b09e2","Type":"ContainerStarted","Data":"4a99feea15cfca8cd3920c7103ca47367f3e7cc87a04fc6e17b8678982d2a4ed"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.205700 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" event={"ID":"f2376906-3f0c-4ba2-b27b-ae1464676554","Type":"ContainerStarted","Data":"7f46f4bf3bcfd5688442b863ad371bf4b1403aaf4c88d3f98acb788d307865b5"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.210619 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" event={"ID":"4ec8c2c0-6649-4fce-bd49-c178b25d9da1","Type":"ContainerStarted","Data":"2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.213065 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" event={"ID":"5d26a632-0017-40e4-ac56-5794c4800d83","Type":"ContainerStarted","Data":"b35c0d2ad06dccee7ef7cd3310ae725460d86e6f502e52b1259c6aa2c8678efe"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.217837 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j465h" event={"ID":"515397b2-ff6d-491a-9c49-fb217236b19f","Type":"ContainerStarted","Data":"df8e9fa77ca3a5af681fe4324f0bba73e9df52e895ccb8213cb3a04bda2c7a40"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.224990 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" event={"ID":"3c5e85dd-ceb0-40eb-86b8-353702e07379","Type":"ContainerStarted","Data":"014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.225146 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.229900 4676 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-k8mk5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.229994 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" podUID="3c5e85dd-ceb0-40eb-86b8-353702e07379" containerName="oauth-openshift" 
probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.232258 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" event={"ID":"63ad8370-13f3-4bcc-80f4-8a8e6b657667","Type":"ContainerStarted","Data":"48b7c60a01ee75fd5984ba0b07f33e0371150fc8c662f219a1dc1e636e0ec815"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.236880 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" event={"ID":"97f344fc-42a3-4630-af31-ea25b72941e6","Type":"ContainerStarted","Data":"b4f43bded5f21772657368b9ac5baf944960760bd0a52d25673663dec634fed4"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.246136 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" event={"ID":"05eb8f48-640e-413a-8845-9b2e3bf86f23","Type":"ContainerStarted","Data":"b7a4aa3a63134fbe128e881c902ea6966572266b623a4c7aab542e47f606438e"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.268574 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" event={"ID":"f652cd09-f743-4500-bace-2652974c9ef3","Type":"ContainerStarted","Data":"5d0565d801c17373ab861025ea269d1d7e602d8fa90c90ae0290f3095abf0051"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.275448 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.275833 4676 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.775812974 +0000 UTC m=+147.758901403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.291395 4676 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-6mnvc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.291458 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" podUID="627ad3bd-de42-40d1-ac5c-ab04b2b2ddca" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.291718 4676 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-66mls container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.291744 4676 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" podUID="6bf298e3-e793-4a56-9856-094437b77046" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.291805 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" event={"ID":"c23c466d-b781-4d90-a824-3061cf8890be","Type":"ContainerStarted","Data":"0dc516e6a5f60babb6d3612e00cdb0c959e3d6503cc5852c8fcbc21c72d3626c"} Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.292272 4676 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-52tgw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.293179 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" podUID="9ae45d95-5ab3-4a8f-906a-6ed43bb21612" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.292382 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qnmrm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.293225 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qnmrm" podUID="2ef453c0-cdb0-4da7-8c32-3b975e1009a1" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.327346 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" podStartSLOduration=122.327320099 podStartE2EDuration="2m2.327320099s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:43.269523329 +0000 UTC m=+147.252611758" watchObservedRunningTime="2025-09-30 14:00:43.327320099 +0000 UTC m=+147.310408528" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.327847 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8dzwh" podStartSLOduration=121.327841323 podStartE2EDuration="2m1.327841323s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:43.32774972 +0000 UTC m=+147.310838169" watchObservedRunningTime="2025-09-30 14:00:43.327841323 +0000 UTC m=+147.310929752" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.362059 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wfv7t" podStartSLOduration=121.362024095 podStartE2EDuration="2m1.362024095s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:43.361251303 +0000 UTC m=+147.344339732" watchObservedRunningTime="2025-09-30 14:00:43.362024095 +0000 UTC m=+147.345112524" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.378322 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.381559 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.881540897 +0000 UTC m=+147.864629326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.480602 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.480861 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.98080126 +0000 UTC m=+147.963889689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.482658 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.490211 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:43.990188762 +0000 UTC m=+147.973277201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.583902 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.585031 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.084985441 +0000 UTC m=+148.068073870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.585812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.586274 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.086247846 +0000 UTC m=+148.069336275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.587147 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:43 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:43 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:43 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.587219 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.687074 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.687496 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 14:00:44.187437503 +0000 UTC m=+148.170525942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.687780 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.688264 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.188246666 +0000 UTC m=+148.171335095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.788808 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.789033 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.28900243 +0000 UTC m=+148.272090859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.789220 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.789713 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.28970088 +0000 UTC m=+148.272789299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.890275 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.890781 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.390761353 +0000 UTC m=+148.373849782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:43 crc kubenswrapper[4676]: I0930 14:00:43.992035 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:43 crc kubenswrapper[4676]: E0930 14:00:43.992474 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.492455544 +0000 UTC m=+148.475543973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.093048 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.093620 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.593596039 +0000 UTC m=+148.576684468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.094142 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.094567 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.594553366 +0000 UTC m=+148.577641795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.195145 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.195494 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.695463135 +0000 UTC m=+148.678551564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.195924 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.196261 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.696247857 +0000 UTC m=+148.679336296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.298006 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.298149 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.798126673 +0000 UTC m=+148.781215102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.298241 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.298326 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.298366 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.298827 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.798816383 +0000 UTC m=+148.781904872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.299502 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.305155 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.309409 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" event={"ID":"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6","Type":"ContainerStarted","Data":"34ad40702b568dc92a5c2dc321347624c9f93952595c8b8e9ee027cdaae1f903"} Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.310712 4676 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-dns/dns-default-j465h" event={"ID":"515397b2-ff6d-491a-9c49-fb217236b19f","Type":"ContainerStarted","Data":"793447b17ec230470a21e1495de2f972d90a2d99b15731b90f49bcf3d9c5984b"} Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.311610 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-j465h" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.312763 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" event={"ID":"63ad8370-13f3-4bcc-80f4-8a8e6b657667","Type":"ContainerStarted","Data":"05775b1f422e14a81440f13fecf116add577820af851f8afcca0e206d148ec63"} Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.314918 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" event={"ID":"ad665bdf-4477-45c2-bcde-96a4042e2176","Type":"ContainerStarted","Data":"9ed0e6b1a41298f3e73021998dbf6de7a0d67d2a7ec3d034915c5ab01cf5c865"} Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.314956 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.317211 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qnmrm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.317254 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qnmrm" podUID="2ef453c0-cdb0-4da7-8c32-3b975e1009a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Sep 30 14:00:44 
crc kubenswrapper[4676]: I0930 14:00:44.317717 4676 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-k8mk5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" start-of-body= Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.317756 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" podUID="3c5e85dd-ceb0-40eb-86b8-353702e07379" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.322388 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnvc" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.326011 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-66mls" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.342064 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j465h" podStartSLOduration=9.342047725 podStartE2EDuration="9.342047725s" podCreationTimestamp="2025-09-30 14:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.336647096 +0000 UTC m=+148.319735535" watchObservedRunningTime="2025-09-30 14:00:44.342047725 +0000 UTC m=+148.325136154" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.359792 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.404404 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.404798 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.404822 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.406036 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:44.906019686 +0000 UTC m=+148.889108115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.411767 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.426554 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.491443 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rs79q" podStartSLOduration=123.491401303 podStartE2EDuration="2m3.491401303s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.396634565 +0000 UTC m=+148.379722994" watchObservedRunningTime="2025-09-30 14:00:44.491401303 +0000 UTC m=+148.474489742" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.493421 4676 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2sdrd" podStartSLOduration=122.493410379 podStartE2EDuration="2m2.493410379s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.486441205 +0000 UTC m=+148.469529634" watchObservedRunningTime="2025-09-30 14:00:44.493410379 +0000 UTC m=+148.476498818" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.507325 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.510115 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.010099373 +0000 UTC m=+148.993187872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.515545 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7ttmn" podStartSLOduration=122.515528004 podStartE2EDuration="2m2.515528004s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.512919523 +0000 UTC m=+148.496007952" watchObservedRunningTime="2025-09-30 14:00:44.515528004 +0000 UTC m=+148.498616433" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.585268 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:44 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:44 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:44 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.586583 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.617472 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.618181 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.118160892 +0000 UTC m=+149.101249321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.645568 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.653529 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.712112 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" podStartSLOduration=123.712092587 podStartE2EDuration="2m3.712092587s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.66193055 +0000 UTC m=+148.645018999" watchObservedRunningTime="2025-09-30 14:00:44.712092587 +0000 UTC m=+148.695181016" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.725700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.726048 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.226036805 +0000 UTC m=+149.209125234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.755100 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gcvn8" podStartSLOduration=123.755080083 podStartE2EDuration="2m3.755080083s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.711551591 +0000 UTC m=+148.694640040" watchObservedRunningTime="2025-09-30 14:00:44.755080083 +0000 UTC m=+148.738168512" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.828075 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.828455 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.328433305 +0000 UTC m=+149.311521734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.828501 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wb8d2" podStartSLOduration=122.828484557 podStartE2EDuration="2m2.828484557s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.767141969 +0000 UTC m=+148.750230398" watchObservedRunningTime="2025-09-30 14:00:44.828484557 +0000 UTC m=+148.811572976" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.829391 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gmxgf" podStartSLOduration=123.829384842 podStartE2EDuration="2m3.829384842s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.813225512 +0000 UTC m=+148.796313951" watchObservedRunningTime="2025-09-30 14:00:44.829384842 +0000 UTC m=+148.812473271" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.863225 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" podStartSLOduration=122.863198053 podStartE2EDuration="2m2.863198053s" 
podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.860169639 +0000 UTC m=+148.843258078" watchObservedRunningTime="2025-09-30 14:00:44.863198053 +0000 UTC m=+148.846286492" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.901935 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" podStartSLOduration=123.90191446 podStartE2EDuration="2m3.90191446s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.898955448 +0000 UTC m=+148.882043877" watchObservedRunningTime="2025-09-30 14:00:44.90191446 +0000 UTC m=+148.885002889" Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.929015 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:44 crc kubenswrapper[4676]: E0930 14:00:44.929353 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.429340744 +0000 UTC m=+149.412429173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:44 crc kubenswrapper[4676]: I0930 14:00:44.930244 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" podStartSLOduration=122.930231339 podStartE2EDuration="2m2.930231339s" podCreationTimestamp="2025-09-30 13:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:44.928177412 +0000 UTC m=+148.911265851" watchObservedRunningTime="2025-09-30 14:00:44.930231339 +0000 UTC m=+148.913319768" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.029841 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.030279 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.530254283 +0000 UTC m=+149.513342712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.030526 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.030905 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.53087649 +0000 UTC m=+149.513964919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.148762 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.149386 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.649365879 +0000 UTC m=+149.632454308 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.257154 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.257698 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.757679204 +0000 UTC m=+149.740767633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.327080 4676 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-52tgw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.327152 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" podUID="9ae45d95-5ab3-4a8f-906a-6ed43bb21612" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:00:45 crc kubenswrapper[4676]: W0930 14:00:45.333064 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-240aecb14b87759ae3d3e360afd59d73530056d3f7e24f987d38a72ac7574d28 WatchSource:0}: Error finding container 240aecb14b87759ae3d3e360afd59d73530056d3f7e24f987d38a72ac7574d28: Status 404 returned error can't find the container with id 240aecb14b87759ae3d3e360afd59d73530056d3f7e24f987d38a72ac7574d28 Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.359432 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.359768 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.859753395 +0000 UTC m=+149.842841824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.461568 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.468985 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:45.968971356 +0000 UTC m=+149.952059785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.479124 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8bmsn"] Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.480075 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bmsn"] Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.480175 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.482775 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.562670 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.562855 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.062822948 +0000 UTC m=+150.045911377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.562932 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-catalog-content\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.563022 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-utilities\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.563070 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbfbm\" (UniqueName: \"kubernetes.io/projected/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-kube-api-access-wbfbm\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.585597 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Sep 30 14:00:45 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:45 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:45 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.585662 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.631333 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w8xnq"] Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.633419 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.638219 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.656179 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8xnq"] Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.664496 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbfbm\" (UniqueName: \"kubernetes.io/projected/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-kube-api-access-wbfbm\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.664557 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-utilities\") pod \"community-operators-w8xnq\" (UID: 
\"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.664607 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.664638 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9g5q\" (UniqueName: \"kubernetes.io/projected/cad652ea-7de0-4021-b594-7ea2a0681286-kube-api-access-h9g5q\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.664737 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-catalog-content\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.664765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-utilities\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.664781 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-catalog-content\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.665441 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.165429305 +0000 UTC m=+150.148517724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.666016 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-catalog-content\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.666356 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-utilities\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.700525 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbfbm\" (UniqueName: 
\"kubernetes.io/projected/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-kube-api-access-wbfbm\") pod \"certified-operators-8bmsn\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.768182 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.768409 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-utilities\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.768482 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9g5q\" (UniqueName: \"kubernetes.io/projected/cad652ea-7de0-4021-b594-7ea2a0681286-kube-api-access-h9g5q\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.768518 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-catalog-content\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.768808 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.268782251 +0000 UTC m=+150.251870680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.769100 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-catalog-content\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.769646 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-utilities\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.790616 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9g5q\" (UniqueName: \"kubernetes.io/projected/cad652ea-7de0-4021-b594-7ea2a0681286-kube-api-access-h9g5q\") pod \"community-operators-w8xnq\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.826035 4676 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-7dvk6"] Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.827005 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.843585 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dvk6"] Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.869652 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-utilities\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.869766 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-catalog-content\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.869801 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75h65\" (UniqueName: \"kubernetes.io/projected/6bb07181-c62b-4440-9ab1-b41f968c8a05-kube-api-access-75h65\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.869825 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.870114 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.370102502 +0000 UTC m=+150.353190931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.920711 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.962434 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.970776 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.971158 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75h65\" (UniqueName: \"kubernetes.io/projected/6bb07181-c62b-4440-9ab1-b41f968c8a05-kube-api-access-75h65\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.971190 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.471168866 +0000 UTC m=+150.454257305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.971220 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-catalog-content\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.971263 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.971306 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-utilities\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: E0930 14:00:45.971673 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 14:00:46.471655019 +0000 UTC m=+150.454743448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.971764 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-catalog-content\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.972362 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-utilities\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:45 crc kubenswrapper[4676]: I0930 14:00:45.998782 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75h65\" (UniqueName: \"kubernetes.io/projected/6bb07181-c62b-4440-9ab1-b41f968c8a05-kube-api-access-75h65\") pod \"certified-operators-7dvk6\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") " pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.023040 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4z4mt"] Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.026448 4676 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.045833 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4z4mt"] Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.072397 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.072634 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwd8p\" (UniqueName: \"kubernetes.io/projected/a878b8c0-a59b-4cc7-ac47-18164d4337af-kube-api-access-qwd8p\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.072670 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-catalog-content\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.072723 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-utilities\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.072836 4676 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.572820796 +0000 UTC m=+150.555909225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.144451 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.184682 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwd8p\" (UniqueName: \"kubernetes.io/projected/a878b8c0-a59b-4cc7-ac47-18164d4337af-kube-api-access-qwd8p\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.184716 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-catalog-content\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.184743 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.184786 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-utilities\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.185200 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-utilities\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.185666 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.685653926 +0000 UTC m=+150.668742355 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.185688 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-catalog-content\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.216342 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwd8p\" (UniqueName: \"kubernetes.io/projected/a878b8c0-a59b-4cc7-ac47-18164d4337af-kube-api-access-qwd8p\") pod \"community-operators-4z4mt\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") " pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.285689 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.285895 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 14:00:46.785855546 +0000 UTC m=+150.768943975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.286031 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.286617 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.786599846 +0000 UTC m=+150.769688285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.292747 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bmsn"] Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.295619 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:00:46 crc kubenswrapper[4676]: W0930 14:00:46.351927 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e521ab_fa8b_43c8_a6ca_c57b507fce2d.slice/crio-2d7f69527f71ccc2d1781dc6a8c454f62e75d4b1f192fadb55e90dad1721bcf3 WatchSource:0}: Error finding container 2d7f69527f71ccc2d1781dc6a8c454f62e75d4b1f192fadb55e90dad1721bcf3: Status 404 returned error can't find the container with id 2d7f69527f71ccc2d1781dc6a8c454f62e75d4b1f192fadb55e90dad1721bcf3 Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.360941 4676 generic.go:334] "Generic (PLEG): container finished" podID="6709a903-5bfa-42c6-be52-df8efb1d106e" containerID="519d01dc7fce15dba99590d1f77048e20d3892fdab8716958f9af3893e8ad4b7" exitCode=0 Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.361300 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" event={"ID":"6709a903-5bfa-42c6-be52-df8efb1d106e","Type":"ContainerDied","Data":"519d01dc7fce15dba99590d1f77048e20d3892fdab8716958f9af3893e8ad4b7"} Sep 30 14:00:46 crc 
kubenswrapper[4676]: I0930 14:00:46.361308 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.372940 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e5319fd7eb68c547a259024c29909042ba0844607aee4fcdf9f68771b924dfd9"} Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.373000 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"240aecb14b87759ae3d3e360afd59d73530056d3f7e24f987d38a72ac7574d28"} Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.391804 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.392421 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:46.892397752 +0000 UTC m=+150.875486181 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.413316 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8xnq"] Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.413367 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"32e035000bea5623a02d5003febd25aec1034925b5f7842b8314b06239beacfb"} Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.413424 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bf984b0e6502178d61ea85b6bfcb948358004573d9cb07cfffa6bc609a137ae3"} Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.414144 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.436073 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a7798f9cc9d8167c672ae6b2e251de16823ff1e652a738d0504e24d7f1bba04e"} Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.436141 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"59ea36608b52e7a08988ea3449e4838b837344693c141f66e3514dc3a68a6d4f"} Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.504854 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.508874 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.008856993 +0000 UTC m=+150.991945422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.587818 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:46 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:46 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:46 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.587887 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.610136 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.610554 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 14:00:47.110535024 +0000 UTC m=+151.093623453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.716836 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.717342 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.217325297 +0000 UTC m=+151.200413726 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.751549 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dvk6"] Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.818487 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.818681 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.318646407 +0000 UTC m=+151.301734836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.819318 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.819683 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.319672376 +0000 UTC m=+151.302760865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:46 crc kubenswrapper[4676]: I0930 14:00:46.920180 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:46 crc kubenswrapper[4676]: E0930 14:00:46.920624 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.420598795 +0000 UTC m=+151.403687224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.021372 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.021753 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.52173373 +0000 UTC m=+151.504822149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.037232 4676 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.064241 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4z4mt"] Sep 30 14:00:47 crc kubenswrapper[4676]: W0930 14:00:47.073110 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda878b8c0_a59b_4cc7_ac47_18164d4337af.slice/crio-c11e4b5f46f2df7c3861e0c463f3ea35f366b21f1c89538e095ce7de1549d386 WatchSource:0}: Error finding container c11e4b5f46f2df7c3861e0c463f3ea35f366b21f1c89538e095ce7de1549d386: Status 404 returned error can't find the container with id c11e4b5f46f2df7c3861e0c463f3ea35f366b21f1c89538e095ce7de1549d386 Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.094321 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.107251 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mqdxw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.123205 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.123356 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.623329479 +0000 UTC m=+151.606417918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.123481 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.123843 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.623833062 +0000 UTC m=+151.606921491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.224927 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.225429 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.72540709 +0000 UTC m=+151.708495519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.226434 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.227556 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.727541249 +0000 UTC m=+151.710629678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.327725 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.328149 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:47.828134139 +0000 UTC m=+151.811222568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.390706 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.423605 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t6gvw"] Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.425521 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.427264 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.443506 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.446101 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 14:00:47.946080383 +0000 UTC m=+151.929168812 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rgs57" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.485777 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6gvw"] Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.485866 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.498311 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4mgv7" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.502348 4676 generic.go:334] "Generic (PLEG): container finished" podID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerID="bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237" exitCode=0 Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.502674 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bmsn" event={"ID":"19e521ab-fa8b-43c8-a6ca-c57b507fce2d","Type":"ContainerDied","Data":"bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.502723 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bmsn" 
event={"ID":"19e521ab-fa8b-43c8-a6ca-c57b507fce2d","Type":"ContainerStarted","Data":"2d7f69527f71ccc2d1781dc6a8c454f62e75d4b1f192fadb55e90dad1721bcf3"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.504523 4676 generic.go:334] "Generic (PLEG): container finished" podID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerID="ea4e1dae03ca9464b2ec8d14207dc1271d626e142019a7310590b23a9c23e199" exitCode=0 Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.504590 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dvk6" event={"ID":"6bb07181-c62b-4440-9ab1-b41f968c8a05","Type":"ContainerDied","Data":"ea4e1dae03ca9464b2ec8d14207dc1271d626e142019a7310590b23a9c23e199"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.504615 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dvk6" event={"ID":"6bb07181-c62b-4440-9ab1-b41f968c8a05","Type":"ContainerStarted","Data":"6ea55ee2bb1c8724011da425e5c6a8f49fe4b2bb2f8872d24e5006b4bbf27052"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.505611 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.510447 4676 generic.go:334] "Generic (PLEG): container finished" podID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerID="b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99" exitCode=0 Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.510756 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z4mt" event={"ID":"a878b8c0-a59b-4cc7-ac47-18164d4337af","Type":"ContainerDied","Data":"b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.510805 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z4mt" 
event={"ID":"a878b8c0-a59b-4cc7-ac47-18164d4337af","Type":"ContainerStarted","Data":"c11e4b5f46f2df7c3861e0c463f3ea35f366b21f1c89538e095ce7de1549d386"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.514546 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" event={"ID":"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6","Type":"ContainerStarted","Data":"730bed93c2606d347b56dec356ef25a1b3a8e27e53573d139f3ce1b6636847ec"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.514585 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" event={"ID":"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6","Type":"ContainerStarted","Data":"8ae7994df5720e8f2b032a355f86fd73273b2fa697ce9cf586417b4c688a9e27"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.514676 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" event={"ID":"2b51a7d3-7ee2-45a0-afa2-e0fb3bbed0b6","Type":"ContainerStarted","Data":"60fd3d977cbdef5099428bd665ef2ccb90dd25cc11690bccebb75efa704a0e92"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.520498 4676 generic.go:334] "Generic (PLEG): container finished" podID="cad652ea-7de0-4021-b594-7ea2a0681286" containerID="67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27" exitCode=0 Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.520537 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8xnq" event={"ID":"cad652ea-7de0-4021-b594-7ea2a0681286","Type":"ContainerDied","Data":"67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.520570 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8xnq" 
event={"ID":"cad652ea-7de0-4021-b594-7ea2a0681286","Type":"ContainerStarted","Data":"e777f253c3e60f7a18141d1913731826cbf10f1e5c498e36e2934c061a8873b6"} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.546813 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.547112 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-utilities\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.547256 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-catalog-content\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.547284 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hkbr\" (UniqueName: \"kubernetes.io/projected/586d40ff-8404-4b50-bfab-bd99ba97daca-kube-api-access-7hkbr\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.547702 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 14:00:48.047679391 +0000 UTC m=+152.030767830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.588647 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:47 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:47 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:47 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.588704 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.619148 4676 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T14:00:47.037260412Z","Handler":null,"Name":""} Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.624155 4676 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI 
Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.624215 4676 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.659410 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-catalog-content\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.653659 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-catalog-content\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.668826 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hkbr\" (UniqueName: \"kubernetes.io/projected/586d40ff-8404-4b50-bfab-bd99ba97daca-kube-api-access-7hkbr\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.669064 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-utilities\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.669177 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.671732 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-utilities\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.685126 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.697611 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.698537 4676 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.698567 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.726701 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hkbr\" (UniqueName: \"kubernetes.io/projected/586d40ff-8404-4b50-bfab-bd99ba97daca-kube-api-access-7hkbr\") pod \"redhat-marketplace-t6gvw\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.739620 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rnmq9" podStartSLOduration=12.739602454 podStartE2EDuration="12.739602454s" podCreationTimestamp="2025-09-30 14:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:47.737092674 +0000 UTC m=+151.720181113" watchObservedRunningTime="2025-09-30 14:00:47.739602454 +0000 UTC m=+151.722690883" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.754630 4676 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xr2v8 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]log ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]etcd ok 
Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/generic-apiserver-start-informers ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/max-in-flight-filter ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 30 14:00:47 crc kubenswrapper[4676]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 30 14:00:47 crc kubenswrapper[4676]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/project.openshift.io-projectcache ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-startinformers ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 30 14:00:47 crc kubenswrapper[4676]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 14:00:47 crc kubenswrapper[4676]: livez check failed Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.754709 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" podUID="5d26a632-0017-40e4-ac56-5794c4800d83" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.791724 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rgs57\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.798657 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.824828 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.827474 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7sphw"] Sep 30 14:00:47 crc kubenswrapper[4676]: E0930 14:00:47.827704 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6709a903-5bfa-42c6-be52-df8efb1d106e" containerName="collect-profiles" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.827721 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6709a903-5bfa-42c6-be52-df8efb1d106e" containerName="collect-profiles" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.827863 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6709a903-5bfa-42c6-be52-df8efb1d106e" containerName="collect-profiles" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.828632 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.850168 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sphw"] Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.871403 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.872334 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.873973 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.877627 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.878214 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.880521 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.900428 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975208 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6709a903-5bfa-42c6-be52-df8efb1d106e-config-volume\") pod \"6709a903-5bfa-42c6-be52-df8efb1d106e\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975464 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6709a903-5bfa-42c6-be52-df8efb1d106e-secret-volume\") pod \"6709a903-5bfa-42c6-be52-df8efb1d106e\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975538 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjgnk\" (UniqueName: \"kubernetes.io/projected/6709a903-5bfa-42c6-be52-df8efb1d106e-kube-api-access-tjgnk\") pod \"6709a903-5bfa-42c6-be52-df8efb1d106e\" (UID: \"6709a903-5bfa-42c6-be52-df8efb1d106e\") " Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975775 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-utilities\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975855 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975904 
4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975935 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-catalog-content\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.975973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4w9l\" (UniqueName: \"kubernetes.io/projected/ddc55b99-87b9-4f3c-a5a5-08b857e41975-kube-api-access-s4w9l\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.976067 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6709a903-5bfa-42c6-be52-df8efb1d106e-config-volume" (OuterVolumeSpecName: "config-volume") pod "6709a903-5bfa-42c6-be52-df8efb1d106e" (UID: "6709a903-5bfa-42c6-be52-df8efb1d106e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.982224 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6709a903-5bfa-42c6-be52-df8efb1d106e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6709a903-5bfa-42c6-be52-df8efb1d106e" (UID: "6709a903-5bfa-42c6-be52-df8efb1d106e"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:00:47 crc kubenswrapper[4676]: I0930 14:00:47.983139 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6709a903-5bfa-42c6-be52-df8efb1d106e-kube-api-access-tjgnk" (OuterVolumeSpecName: "kube-api-access-tjgnk") pod "6709a903-5bfa-42c6-be52-df8efb1d106e" (UID: "6709a903-5bfa-42c6-be52-df8efb1d106e"). InnerVolumeSpecName "kube-api-access-tjgnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.070010 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.073113 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6gvw"] Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077070 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077109 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-catalog-content\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077145 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4w9l\" (UniqueName: \"kubernetes.io/projected/ddc55b99-87b9-4f3c-a5a5-08b857e41975-kube-api-access-s4w9l\") pod 
\"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077194 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-utilities\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077260 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077318 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6709a903-5bfa-42c6-be52-df8efb1d106e-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077337 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjgnk\" (UniqueName: \"kubernetes.io/projected/6709a903-5bfa-42c6-be52-df8efb1d106e-kube-api-access-tjgnk\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077350 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6709a903-5bfa-42c6-be52-df8efb1d106e-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.077399 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.078620 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-catalog-content\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.078677 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-utilities\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.096913 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4w9l\" (UniqueName: \"kubernetes.io/projected/ddc55b99-87b9-4f3c-a5a5-08b857e41975-kube-api-access-s4w9l\") pod \"redhat-marketplace-7sphw\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") " pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.099000 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.148798 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.194927 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.206576 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qnmrm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.206648 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qnmrm" podUID="2ef453c0-cdb0-4da7-8c32-3b975e1009a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.211358 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qnmrm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.211407 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qnmrm" podUID="2ef453c0-cdb0-4da7-8c32-3b975e1009a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.378335 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rgs57"] Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.461025 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.461096 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.474665 4676 patch_prober.go:28] interesting pod/console-f9d7485db-zgcbx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.474807 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zgcbx" podUID="1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.549460 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" event={"ID":"6709a903-5bfa-42c6-be52-df8efb1d106e","Type":"ContainerDied","Data":"56cb7f2a7b3c3d7f359899746aba8e0b5796f31ddbcd4873b4e1942e0c4e5951"} Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.549506 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56cb7f2a7b3c3d7f359899746aba8e0b5796f31ddbcd4873b4e1942e0c4e5951" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.549573 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.569130 4676 generic.go:334] "Generic (PLEG): container finished" podID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerID="c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe" exitCode=0 Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.569234 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6gvw" event={"ID":"586d40ff-8404-4b50-bfab-bd99ba97daca","Type":"ContainerDied","Data":"c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe"} Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.569268 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6gvw" event={"ID":"586d40ff-8404-4b50-bfab-bd99ba97daca","Type":"ContainerStarted","Data":"116cd9a9aadf880d483ebe7c4bb18311a2798147600f0200765b6e94fa3f9621"} Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.575143 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" event={"ID":"aa6b14cb-7f79-4bc6-bc14-58325daf3c86","Type":"ContainerStarted","Data":"45aaf30474aff199fb5b474ee9cb4ac4253ccc78ab319223fa996ad2992a212c"} Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.582269 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.593931 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:48 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:48 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:48 crc 
kubenswrapper[4676]: healthz check failed Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.594005 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.636815 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5q6qs"] Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.640190 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.642290 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.645282 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5q6qs"] Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.669720 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.677956 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.684456 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-52tgw" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.718286 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sphw"] Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.800868 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-utilities\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.803130 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvr9w\" (UniqueName: \"kubernetes.io/projected/9d686c86-1f90-4466-a2e9-19f8f15c7e87-kube-api-access-tvr9w\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.803191 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-catalog-content\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.861641 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 14:00:48 crc kubenswrapper[4676]: W0930 14:00:48.882211 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda4c8cc1a_bfb8_4999_9a79_be01ae514a3c.slice/crio-13c05ddb25066ba161d82385a70db5302b95c393e60d120359db62119fafee05 WatchSource:0}: Error finding container 13c05ddb25066ba161d82385a70db5302b95c393e60d120359db62119fafee05: Status 404 returned error can't find the container with id 13c05ddb25066ba161d82385a70db5302b95c393e60d120359db62119fafee05 Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.920566 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-catalog-content\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.920632 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-utilities\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.920722 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvr9w\" (UniqueName: \"kubernetes.io/projected/9d686c86-1f90-4466-a2e9-19f8f15c7e87-kube-api-access-tvr9w\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.921203 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-catalog-content\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.921497 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-utilities\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.946038 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvr9w\" (UniqueName: 
\"kubernetes.io/projected/9d686c86-1f90-4466-a2e9-19f8f15c7e87-kube-api-access-tvr9w\") pod \"redhat-operators-5q6qs\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:48 crc kubenswrapper[4676]: I0930 14:00:48.973531 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.022070 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ndlhq"] Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.023471 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.052027 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndlhq"] Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.124213 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-utilities\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.124279 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-catalog-content\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.124313 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmf2\" (UniqueName: 
\"kubernetes.io/projected/98511dcf-3151-4955-b80b-c744ff06da2c-kube-api-access-gkmf2\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.226292 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-catalog-content\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.226354 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkmf2\" (UniqueName: \"kubernetes.io/projected/98511dcf-3151-4955-b80b-c744ff06da2c-kube-api-access-gkmf2\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.226454 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-utilities\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.226980 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-utilities\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.227001 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-catalog-content\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.251791 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkmf2\" (UniqueName: \"kubernetes.io/projected/98511dcf-3151-4955-b80b-c744ff06da2c-kube-api-access-gkmf2\") pod \"redhat-operators-ndlhq\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") " pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.366353 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.458730 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.572321 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5q6qs"] Sep 30 14:00:49 crc kubenswrapper[4676]: W0930 14:00:49.577981 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d686c86_1f90_4466_a2e9_19f8f15c7e87.slice/crio-4ba66d080e0a95bf6280103cd79fbc8fb5f58534a93c729be94b3e5d126fb92e WatchSource:0}: Error finding container 4ba66d080e0a95bf6280103cd79fbc8fb5f58534a93c729be94b3e5d126fb92e: Status 404 returned error can't find the container with id 4ba66d080e0a95bf6280103cd79fbc8fb5f58534a93c729be94b3e5d126fb92e Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.589101 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:49 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:49 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:49 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.589178 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.628774 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" event={"ID":"aa6b14cb-7f79-4bc6-bc14-58325daf3c86","Type":"ContainerStarted","Data":"b67ff64aee72ee79fb252d1c7eeebbd63cc03ce7e2a3d4756d92cb1e85f362ff"} Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.628914 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.666213 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" podStartSLOduration=128.666192475 podStartE2EDuration="2m8.666192475s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:49.663535151 +0000 UTC m=+153.646623580" watchObservedRunningTime="2025-09-30 14:00:49.666192475 +0000 UTC m=+153.649280914" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.672203 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c","Type":"ContainerStarted","Data":"13c05ddb25066ba161d82385a70db5302b95c393e60d120359db62119fafee05"} Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.677345 4676 generic.go:334] "Generic (PLEG): container finished" podID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerID="94b5969bfcadc1cbb9f6022cde7f1c6fd569ff610dae2ec00833a0d62dfa37d6" exitCode=0 Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.677400 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sphw" event={"ID":"ddc55b99-87b9-4f3c-a5a5-08b857e41975","Type":"ContainerDied","Data":"94b5969bfcadc1cbb9f6022cde7f1c6fd569ff610dae2ec00833a0d62dfa37d6"} Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.677468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sphw" event={"ID":"ddc55b99-87b9-4f3c-a5a5-08b857e41975","Type":"ContainerStarted","Data":"9493c61515478d51b931a0656af6c818d8fae88c8445b37c3d1f05dec2ff11bf"} Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.810387 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.811835 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.817398 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.817597 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.860632 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.923853 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndlhq"] Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.951050 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:49 crc kubenswrapper[4676]: I0930 14:00:49.951128 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:49 crc kubenswrapper[4676]: W0930 14:00:49.976645 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98511dcf_3151_4955_b80b_c744ff06da2c.slice/crio-efdb624ca6188c86d10e2410546f4b386814050869b846e0c2e1303427eae893 WatchSource:0}: Error finding container 
efdb624ca6188c86d10e2410546f4b386814050869b846e0c2e1303427eae893: Status 404 returned error can't find the container with id efdb624ca6188c86d10e2410546f4b386814050869b846e0c2e1303427eae893 Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.052493 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.052557 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.052720 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.075913 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.144260 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.447141 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.586289 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:50 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:50 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:50 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.586413 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.701094 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6468238-1faf-4dc1-82f2-1d2ade02bbeb","Type":"ContainerStarted","Data":"a4fb73f56c7bcc010378e002909e353947cfc025485d27365ab278c8cf5b74ed"} Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.705448 4676 generic.go:334] "Generic (PLEG): container finished" podID="98511dcf-3151-4955-b80b-c744ff06da2c" containerID="d5535043d281d56183c250243cfa6e6f7947d910b74e472d1d1e2b9224f38761" exitCode=0 Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.705563 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndlhq" event={"ID":"98511dcf-3151-4955-b80b-c744ff06da2c","Type":"ContainerDied","Data":"d5535043d281d56183c250243cfa6e6f7947d910b74e472d1d1e2b9224f38761"} Sep 30 14:00:50 crc 
kubenswrapper[4676]: I0930 14:00:50.705607 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndlhq" event={"ID":"98511dcf-3151-4955-b80b-c744ff06da2c","Type":"ContainerStarted","Data":"efdb624ca6188c86d10e2410546f4b386814050869b846e0c2e1303427eae893"} Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.745697 4676 generic.go:334] "Generic (PLEG): container finished" podID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerID="e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1" exitCode=0 Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.745787 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q6qs" event={"ID":"9d686c86-1f90-4466-a2e9-19f8f15c7e87","Type":"ContainerDied","Data":"e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1"} Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.745834 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q6qs" event={"ID":"9d686c86-1f90-4466-a2e9-19f8f15c7e87","Type":"ContainerStarted","Data":"4ba66d080e0a95bf6280103cd79fbc8fb5f58534a93c729be94b3e5d126fb92e"} Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.751457 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c","Type":"ContainerDied","Data":"12001b01014df1890980e83851d3fea6ad175366784ab634af1a7b1411a398fd"} Sep 30 14:00:50 crc kubenswrapper[4676]: I0930 14:00:50.751242 4676 generic.go:334] "Generic (PLEG): container finished" podID="a4c8cc1a-bfb8-4999-9a79-be01ae514a3c" containerID="12001b01014df1890980e83851d3fea6ad175366784ab634af1a7b1411a398fd" exitCode=0 Sep 30 14:00:51 crc kubenswrapper[4676]: I0930 14:00:51.584366 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:51 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:51 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:51 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:51 crc kubenswrapper[4676]: I0930 14:00:51.585143 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:51 crc kubenswrapper[4676]: I0930 14:00:51.768273 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6468238-1faf-4dc1-82f2-1d2ade02bbeb","Type":"ContainerStarted","Data":"9352673ea2de9ad11fde999debfd7bc88933175ac98b2fb2aec1b0d9190f3e52"} Sep 30 14:00:51 crc kubenswrapper[4676]: I0930 14:00:51.786978 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.786959421 podStartE2EDuration="2.786959421s" podCreationTimestamp="2025-09-30 14:00:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:00:51.783454293 +0000 UTC m=+155.766542722" watchObservedRunningTime="2025-09-30 14:00:51.786959421 +0000 UTC m=+155.770047850" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.146646 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.315750 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kubelet-dir\") pod \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\" (UID: \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.315865 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kube-api-access\") pod \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\" (UID: \"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c\") " Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.316475 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a4c8cc1a-bfb8-4999-9a79-be01ae514a3c" (UID: "a4c8cc1a-bfb8-4999-9a79-be01ae514a3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.324207 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a4c8cc1a-bfb8-4999-9a79-be01ae514a3c" (UID: "a4c8cc1a-bfb8-4999-9a79-be01ae514a3c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.417033 4676 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.417068 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4c8cc1a-bfb8-4999-9a79-be01ae514a3c-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.548810 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.589518 4676 patch_prober.go:28] interesting pod/router-default-5444994796-tj9mh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 14:00:52 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Sep 30 14:00:52 crc kubenswrapper[4676]: [+]process-running ok Sep 30 14:00:52 crc kubenswrapper[4676]: healthz check failed Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.589651 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tj9mh" podUID="b7545eda-fbd7-4d47-8f48-084b1319bf34" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.691871 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.699064 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xr2v8" Sep 30 
14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.862468 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.862568 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a4c8cc1a-bfb8-4999-9a79-be01ae514a3c","Type":"ContainerDied","Data":"13c05ddb25066ba161d82385a70db5302b95c393e60d120359db62119fafee05"} Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.862670 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13c05ddb25066ba161d82385a70db5302b95c393e60d120359db62119fafee05" Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.905807 4676 generic.go:334] "Generic (PLEG): container finished" podID="b6468238-1faf-4dc1-82f2-1d2ade02bbeb" containerID="9352673ea2de9ad11fde999debfd7bc88933175ac98b2fb2aec1b0d9190f3e52" exitCode=0 Sep 30 14:00:52 crc kubenswrapper[4676]: I0930 14:00:52.906100 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6468238-1faf-4dc1-82f2-1d2ade02bbeb","Type":"ContainerDied","Data":"9352673ea2de9ad11fde999debfd7bc88933175ac98b2fb2aec1b0d9190f3e52"} Sep 30 14:00:53 crc kubenswrapper[4676]: I0930 14:00:53.584289 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:53 crc kubenswrapper[4676]: I0930 14:00:53.588419 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tj9mh" Sep 30 14:00:53 crc kubenswrapper[4676]: I0930 14:00:53.728445 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j465h" Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.361311 4676 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.487574 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kube-api-access\") pod \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.488362 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kubelet-dir\") pod \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\" (UID: \"b6468238-1faf-4dc1-82f2-1d2ade02bbeb\") " Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.488456 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b6468238-1faf-4dc1-82f2-1d2ade02bbeb" (UID: "b6468238-1faf-4dc1-82f2-1d2ade02bbeb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.489051 4676 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.496984 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b6468238-1faf-4dc1-82f2-1d2ade02bbeb" (UID: "b6468238-1faf-4dc1-82f2-1d2ade02bbeb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.590811 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6468238-1faf-4dc1-82f2-1d2ade02bbeb-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.947278 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6468238-1faf-4dc1-82f2-1d2ade02bbeb","Type":"ContainerDied","Data":"a4fb73f56c7bcc010378e002909e353947cfc025485d27365ab278c8cf5b74ed"} Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.947345 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4fb73f56c7bcc010378e002909e353947cfc025485d27365ab278c8cf5b74ed" Sep 30 14:00:54 crc kubenswrapper[4676]: I0930 14:00:54.947664 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 14:00:58 crc kubenswrapper[4676]: I0930 14:00:58.202942 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qnmrm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Sep 30 14:00:58 crc kubenswrapper[4676]: I0930 14:00:58.203375 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qnmrm" podUID="2ef453c0-cdb0-4da7-8c32-3b975e1009a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Sep 30 14:00:58 crc kubenswrapper[4676]: I0930 14:00:58.203108 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qnmrm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Sep 30 14:00:58 crc kubenswrapper[4676]: I0930 14:00:58.204015 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qnmrm" podUID="2ef453c0-cdb0-4da7-8c32-3b975e1009a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Sep 30 14:00:58 crc kubenswrapper[4676]: I0930 14:00:58.471016 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:58 crc kubenswrapper[4676]: I0930 14:00:58.476652 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:00:59 crc kubenswrapper[4676]: I0930 14:00:59.919387 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:00:59 crc kubenswrapper[4676]: I0930 14:00:59.919996 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:01:04 crc kubenswrapper[4676]: I0930 14:01:04.084744 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:01:04 crc 
kubenswrapper[4676]: I0930 14:01:04.094366 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e47ea2c6-e937-4411-b4c9-98048a5e5f05-metrics-certs\") pod \"network-metrics-daemon-sksn7\" (UID: \"e47ea2c6-e937-4411-b4c9-98048a5e5f05\") " pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:01:04 crc kubenswrapper[4676]: I0930 14:01:04.348608 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sksn7" Sep 30 14:01:08 crc kubenswrapper[4676]: I0930 14:01:08.077569 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:01:08 crc kubenswrapper[4676]: I0930 14:01:08.218068 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qnmrm" Sep 30 14:01:18 crc kubenswrapper[4676]: I0930 14:01:18.630727 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r28xn" Sep 30 14:01:19 crc kubenswrapper[4676]: E0930 14:01:19.176384 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 14:01:19 crc kubenswrapper[4676]: E0930 14:01:19.176581 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbfbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8bmsn_openshift-marketplace(19e521ab-fa8b-43c8-a6ca-c57b507fce2d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:01:19 crc kubenswrapper[4676]: E0930 14:01:19.177904 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8bmsn" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" Sep 30 14:01:23 crc 
kubenswrapper[4676]: I0930 14:01:23.719925 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sksn7"] Sep 30 14:01:24 crc kubenswrapper[4676]: E0930 14:01:24.078419 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8bmsn" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" Sep 30 14:01:24 crc kubenswrapper[4676]: E0930 14:01:24.134661 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 14:01:24 crc kubenswrapper[4676]: E0930 14:01:24.134961 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4w9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7sphw_openshift-marketplace(ddc55b99-87b9-4f3c-a5a5-08b857e41975): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:01:24 crc kubenswrapper[4676]: E0930 14:01:24.136153 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7sphw" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" Sep 30 14:01:24 crc 
kubenswrapper[4676]: E0930 14:01:24.180338 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 14:01:24 crc kubenswrapper[4676]: E0930 14:01:24.180499 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75h65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-7dvk6_openshift-marketplace(6bb07181-c62b-4440-9ab1-b41f968c8a05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:01:24 crc kubenswrapper[4676]: E0930 14:01:24.182572 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7dvk6" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" Sep 30 14:01:24 crc kubenswrapper[4676]: I0930 14:01:24.659955 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.409544 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7dvk6" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.409646 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7sphw" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.482354 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.482487 4676 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7hkbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-t6gvw_openshift-marketplace(586d40ff-8404-4b50-bfab-bd99ba97daca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.484952 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-t6gvw" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.566180 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.566348 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkmf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmor
Profile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ndlhq_openshift-marketplace(98511dcf-3151-4955-b80b-c744ff06da2c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.567545 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ndlhq" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.573683 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.573813 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwd8p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-4z4mt_openshift-marketplace(a878b8c0-a59b-4cc7-ac47-18164d4337af): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.575461 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4z4mt" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" Sep 30 14:01:25 crc 
kubenswrapper[4676]: E0930 14:01:25.578250 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.578414 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h9g5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-w8xnq_openshift-marketplace(cad652ea-7de0-4021-b594-7ea2a0681286): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:01:25 crc kubenswrapper[4676]: E0930 14:01:25.579525 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-w8xnq" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" Sep 30 14:01:26 crc kubenswrapper[4676]: I0930 14:01:26.188475 4676 generic.go:334] "Generic (PLEG): container finished" podID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerID="5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90" exitCode=0 Sep 30 14:01:26 crc kubenswrapper[4676]: I0930 14:01:26.188559 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q6qs" event={"ID":"9d686c86-1f90-4466-a2e9-19f8f15c7e87","Type":"ContainerDied","Data":"5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90"} Sep 30 14:01:26 crc kubenswrapper[4676]: I0930 14:01:26.190945 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sksn7" event={"ID":"e47ea2c6-e937-4411-b4c9-98048a5e5f05","Type":"ContainerStarted","Data":"fb53cb94753f317180305a75baf1fa3aa6e9c4e9be39a95ad35b84fc6a65b52a"} Sep 30 14:01:26 crc kubenswrapper[4676]: I0930 14:01:26.190988 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sksn7" event={"ID":"e47ea2c6-e937-4411-b4c9-98048a5e5f05","Type":"ContainerStarted","Data":"6bec63e4a2a15143ec60594c01a9d21a5bfe389a3b0501a8bfde6955cc373f3d"} Sep 30 14:01:26 crc kubenswrapper[4676]: I0930 14:01:26.191000 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-sksn7" event={"ID":"e47ea2c6-e937-4411-b4c9-98048a5e5f05","Type":"ContainerStarted","Data":"d627faa45579aa9cc97e067593e3935bc227aa56835730e3d1e86983f98e9d81"} Sep 30 14:01:26 crc kubenswrapper[4676]: E0930 14:01:26.192703 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ndlhq" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" Sep 30 14:01:26 crc kubenswrapper[4676]: E0930 14:01:26.192729 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-t6gvw" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" Sep 30 14:01:26 crc kubenswrapper[4676]: E0930 14:01:26.192820 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-w8xnq" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" Sep 30 14:01:26 crc kubenswrapper[4676]: E0930 14:01:26.192968 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4z4mt" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" Sep 30 14:01:26 crc kubenswrapper[4676]: I0930 14:01:26.252480 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sksn7" podStartSLOduration=165.252455073 
podStartE2EDuration="2m45.252455073s" podCreationTimestamp="2025-09-30 13:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:01:26.244762409 +0000 UTC m=+190.227850848" watchObservedRunningTime="2025-09-30 14:01:26.252455073 +0000 UTC m=+190.235543502" Sep 30 14:01:27 crc kubenswrapper[4676]: I0930 14:01:27.202631 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q6qs" event={"ID":"9d686c86-1f90-4466-a2e9-19f8f15c7e87","Type":"ContainerStarted","Data":"441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f"} Sep 30 14:01:27 crc kubenswrapper[4676]: I0930 14:01:27.233807 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5q6qs" podStartSLOduration=3.055859759 podStartE2EDuration="39.23377242s" podCreationTimestamp="2025-09-30 14:00:48 +0000 UTC" firstStartedPulling="2025-09-30 14:00:50.74731963 +0000 UTC m=+154.730408059" lastFinishedPulling="2025-09-30 14:01:26.925232291 +0000 UTC m=+190.908320720" observedRunningTime="2025-09-30 14:01:27.220455619 +0000 UTC m=+191.203544058" watchObservedRunningTime="2025-09-30 14:01:27.23377242 +0000 UTC m=+191.216860859" Sep 30 14:01:28 crc kubenswrapper[4676]: I0930 14:01:28.975103 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:01:28 crc kubenswrapper[4676]: I0930 14:01:28.975564 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:01:29 crc kubenswrapper[4676]: I0930 14:01:29.919586 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Sep 30 14:01:29 crc kubenswrapper[4676]: I0930 14:01:29.919926 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:01:30 crc kubenswrapper[4676]: I0930 14:01:30.119923 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5q6qs" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="registry-server" probeResult="failure" output=< Sep 30 14:01:30 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Sep 30 14:01:30 crc kubenswrapper[4676]: > Sep 30 14:01:38 crc kubenswrapper[4676]: I0930 14:01:38.266444 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dvk6" event={"ID":"6bb07181-c62b-4440-9ab1-b41f968c8a05","Type":"ContainerStarted","Data":"e0ae77608f2f926e6e4fe78a61fb434753ce4dd52625563ab766cd891f532996"} Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.026810 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.073745 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.278737 4676 generic.go:334] "Generic (PLEG): container finished" podID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerID="6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420" exitCode=0 Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.278843 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bmsn" 
event={"ID":"19e521ab-fa8b-43c8-a6ca-c57b507fce2d","Type":"ContainerDied","Data":"6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420"} Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.281332 4676 generic.go:334] "Generic (PLEG): container finished" podID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerID="8b8b4287b4df976d9914490bf4cbc31aadd58541826b75c8bdb71f8f5603959c" exitCode=0 Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.281396 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sphw" event={"ID":"ddc55b99-87b9-4f3c-a5a5-08b857e41975","Type":"ContainerDied","Data":"8b8b4287b4df976d9914490bf4cbc31aadd58541826b75c8bdb71f8f5603959c"} Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.288247 4676 generic.go:334] "Generic (PLEG): container finished" podID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerID="e0ae77608f2f926e6e4fe78a61fb434753ce4dd52625563ab766cd891f532996" exitCode=0 Sep 30 14:01:39 crc kubenswrapper[4676]: I0930 14:01:39.288376 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dvk6" event={"ID":"6bb07181-c62b-4440-9ab1-b41f968c8a05","Type":"ContainerDied","Data":"e0ae77608f2f926e6e4fe78a61fb434753ce4dd52625563ab766cd891f532996"} Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.297190 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sphw" event={"ID":"ddc55b99-87b9-4f3c-a5a5-08b857e41975","Type":"ContainerStarted","Data":"8452d2b9d5e8c0bd147f2f1d86aa10baceb555116e0f809d81cbc1beac94edf2"} Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.302048 4676 generic.go:334] "Generic (PLEG): container finished" podID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerID="19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2" exitCode=0 Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.302103 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-t6gvw" event={"ID":"586d40ff-8404-4b50-bfab-bd99ba97daca","Type":"ContainerDied","Data":"19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2"} Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.305926 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dvk6" event={"ID":"6bb07181-c62b-4440-9ab1-b41f968c8a05","Type":"ContainerStarted","Data":"b45196f199a258d0e0a414b6a3321aee06a7eb8dc13102b90116b0111eef439e"} Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.310232 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bmsn" event={"ID":"19e521ab-fa8b-43c8-a6ca-c57b507fce2d","Type":"ContainerStarted","Data":"02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060"} Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.314855 4676 generic.go:334] "Generic (PLEG): container finished" podID="98511dcf-3151-4955-b80b-c744ff06da2c" containerID="5fecd8a9c902f0b022748890b4b1ff08f5532b2fd09d0cb33c378b09bdd5edf1" exitCode=0 Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.314926 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndlhq" event={"ID":"98511dcf-3151-4955-b80b-c744ff06da2c","Type":"ContainerDied","Data":"5fecd8a9c902f0b022748890b4b1ff08f5532b2fd09d0cb33c378b09bdd5edf1"} Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.320309 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7sphw" podStartSLOduration=3.191996466 podStartE2EDuration="53.320286787s" podCreationTimestamp="2025-09-30 14:00:47 +0000 UTC" firstStartedPulling="2025-09-30 14:00:49.682084937 +0000 UTC m=+153.665173366" lastFinishedPulling="2025-09-30 14:01:39.810375258 +0000 UTC m=+203.793463687" observedRunningTime="2025-09-30 14:01:40.317834318 +0000 UTC m=+204.300922747" 
watchObservedRunningTime="2025-09-30 14:01:40.320286787 +0000 UTC m=+204.303375216" Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.342623 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8bmsn" podStartSLOduration=3.132303926 podStartE2EDuration="55.342570206s" podCreationTimestamp="2025-09-30 14:00:45 +0000 UTC" firstStartedPulling="2025-09-30 14:00:47.505343523 +0000 UTC m=+151.488431952" lastFinishedPulling="2025-09-30 14:01:39.715609803 +0000 UTC m=+203.698698232" observedRunningTime="2025-09-30 14:01:40.339254054 +0000 UTC m=+204.322342483" watchObservedRunningTime="2025-09-30 14:01:40.342570206 +0000 UTC m=+204.325658635" Sep 30 14:01:40 crc kubenswrapper[4676]: I0930 14:01:40.395031 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7dvk6" podStartSLOduration=2.7980292909999998 podStartE2EDuration="55.394996554s" podCreationTimestamp="2025-09-30 14:00:45 +0000 UTC" firstStartedPulling="2025-09-30 14:00:47.509670653 +0000 UTC m=+151.492759082" lastFinishedPulling="2025-09-30 14:01:40.106637926 +0000 UTC m=+204.089726345" observedRunningTime="2025-09-30 14:01:40.385463068 +0000 UTC m=+204.368551517" watchObservedRunningTime="2025-09-30 14:01:40.394996554 +0000 UTC m=+204.378084983" Sep 30 14:01:41 crc kubenswrapper[4676]: I0930 14:01:41.323378 4676 generic.go:334] "Generic (PLEG): container finished" podID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerID="e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca" exitCode=0 Sep 30 14:01:41 crc kubenswrapper[4676]: I0930 14:01:41.323493 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z4mt" event={"ID":"a878b8c0-a59b-4cc7-ac47-18164d4337af","Type":"ContainerDied","Data":"e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca"} Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.341979 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndlhq" event={"ID":"98511dcf-3151-4955-b80b-c744ff06da2c","Type":"ContainerStarted","Data":"ff80039c2ecff735693d89b945a7d9e16aa2e50037b6979d5055174b244d1e2a"} Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.346467 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z4mt" event={"ID":"a878b8c0-a59b-4cc7-ac47-18164d4337af","Type":"ContainerStarted","Data":"8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236"} Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.349122 4676 generic.go:334] "Generic (PLEG): container finished" podID="cad652ea-7de0-4021-b594-7ea2a0681286" containerID="e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105" exitCode=0 Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.349198 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8xnq" event={"ID":"cad652ea-7de0-4021-b594-7ea2a0681286","Type":"ContainerDied","Data":"e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105"} Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.352956 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6gvw" event={"ID":"586d40ff-8404-4b50-bfab-bd99ba97daca","Type":"ContainerStarted","Data":"e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea"} Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.365473 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ndlhq" podStartSLOduration=2.55762879 podStartE2EDuration="53.365454454s" podCreationTimestamp="2025-09-30 14:00:49 +0000 UTC" firstStartedPulling="2025-09-30 14:00:50.710121685 +0000 UTC m=+154.693210114" lastFinishedPulling="2025-09-30 14:01:41.517947349 +0000 UTC m=+205.501035778" observedRunningTime="2025-09-30 14:01:42.361313989 +0000 
UTC m=+206.344402438" watchObservedRunningTime="2025-09-30 14:01:42.365454454 +0000 UTC m=+206.348542883" Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.399743 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4z4mt" podStartSLOduration=1.836244914 podStartE2EDuration="56.399717368s" podCreationTimestamp="2025-09-30 14:00:46 +0000 UTC" firstStartedPulling="2025-09-30 14:00:47.511834493 +0000 UTC m=+151.494922922" lastFinishedPulling="2025-09-30 14:01:42.075306947 +0000 UTC m=+206.058395376" observedRunningTime="2025-09-30 14:01:42.39838674 +0000 UTC m=+206.381475179" watchObservedRunningTime="2025-09-30 14:01:42.399717368 +0000 UTC m=+206.382805797" Sep 30 14:01:42 crc kubenswrapper[4676]: I0930 14:01:42.417585 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t6gvw" podStartSLOduration=2.3276493670000002 podStartE2EDuration="55.417562634s" podCreationTimestamp="2025-09-30 14:00:47 +0000 UTC" firstStartedPulling="2025-09-30 14:00:48.571553803 +0000 UTC m=+152.554642222" lastFinishedPulling="2025-09-30 14:01:41.66146706 +0000 UTC m=+205.644555489" observedRunningTime="2025-09-30 14:01:42.416426512 +0000 UTC m=+206.399514941" watchObservedRunningTime="2025-09-30 14:01:42.417562634 +0000 UTC m=+206.400651063" Sep 30 14:01:43 crc kubenswrapper[4676]: I0930 14:01:43.361245 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8xnq" event={"ID":"cad652ea-7de0-4021-b594-7ea2a0681286","Type":"ContainerStarted","Data":"c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d"} Sep 30 14:01:43 crc kubenswrapper[4676]: I0930 14:01:43.386345 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w8xnq" podStartSLOduration=3.149946047 podStartE2EDuration="58.386313841s" podCreationTimestamp="2025-09-30 14:00:45 
+0000 UTC" firstStartedPulling="2025-09-30 14:00:47.522712286 +0000 UTC m=+151.505800715" lastFinishedPulling="2025-09-30 14:01:42.75908008 +0000 UTC m=+206.742168509" observedRunningTime="2025-09-30 14:01:43.386163416 +0000 UTC m=+207.369251855" watchObservedRunningTime="2025-09-30 14:01:43.386313841 +0000 UTC m=+207.369402280" Sep 30 14:01:45 crc kubenswrapper[4676]: I0930 14:01:45.921908 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:01:45 crc kubenswrapper[4676]: I0930 14:01:45.922006 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:01:45 crc kubenswrapper[4676]: I0930 14:01:45.963018 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:01:45 crc kubenswrapper[4676]: I0930 14:01:45.963122 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:01:45 crc kubenswrapper[4676]: I0930 14:01:45.981736 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.010646 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.144981 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.145302 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.183202 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.362060 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.362120 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.402612 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4z4mt" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.422721 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:01:46 crc kubenswrapper[4676]: I0930 14:01:46.446477 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7dvk6" Sep 30 14:01:47 crc kubenswrapper[4676]: I0930 14:01:47.799255 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:01:47 crc kubenswrapper[4676]: I0930 14:01:47.799312 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:01:47 crc kubenswrapper[4676]: I0930 14:01:47.837687 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:01:48 crc kubenswrapper[4676]: I0930 14:01:48.150766 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:01:48 crc kubenswrapper[4676]: I0930 14:01:48.150859 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:01:48 crc kubenswrapper[4676]: I0930 
14:01:48.194054 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:01:48 crc kubenswrapper[4676]: I0930 14:01:48.427362 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:01:48 crc kubenswrapper[4676]: I0930 14:01:48.427950 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7sphw" Sep 30 14:01:48 crc kubenswrapper[4676]: I0930 14:01:48.867266 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dvk6"] Sep 30 14:01:49 crc kubenswrapper[4676]: I0930 14:01:49.366732 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:01:49 crc kubenswrapper[4676]: I0930 14:01:49.367438 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:01:49 crc kubenswrapper[4676]: I0930 14:01:49.392949 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7dvk6" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="registry-server" containerID="cri-o://b45196f199a258d0e0a414b6a3321aee06a7eb8dc13102b90116b0111eef439e" gracePeriod=2 Sep 30 14:01:49 crc kubenswrapper[4676]: I0930 14:01:49.404428 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:01:49 crc kubenswrapper[4676]: I0930 14:01:49.450986 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ndlhq" Sep 30 14:01:50 crc kubenswrapper[4676]: I0930 14:01:50.399253 4676 generic.go:334] "Generic (PLEG): container finished" podID="6bb07181-c62b-4440-9ab1-b41f968c8a05" 
containerID="b45196f199a258d0e0a414b6a3321aee06a7eb8dc13102b90116b0111eef439e" exitCode=0
Sep 30 14:01:50 crc kubenswrapper[4676]: I0930 14:01:50.399338 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dvk6" event={"ID":"6bb07181-c62b-4440-9ab1-b41f968c8a05","Type":"ContainerDied","Data":"b45196f199a258d0e0a414b6a3321aee06a7eb8dc13102b90116b0111eef439e"}
Sep 30 14:01:50 crc kubenswrapper[4676]: I0930 14:01:50.967960 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dvk6"
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.130682 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-catalog-content\") pod \"6bb07181-c62b-4440-9ab1-b41f968c8a05\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") "
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.130871 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-utilities\") pod \"6bb07181-c62b-4440-9ab1-b41f968c8a05\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") "
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.131005 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75h65\" (UniqueName: \"kubernetes.io/projected/6bb07181-c62b-4440-9ab1-b41f968c8a05-kube-api-access-75h65\") pod \"6bb07181-c62b-4440-9ab1-b41f968c8a05\" (UID: \"6bb07181-c62b-4440-9ab1-b41f968c8a05\") "
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.133467 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-utilities" (OuterVolumeSpecName: "utilities") pod "6bb07181-c62b-4440-9ab1-b41f968c8a05" (UID: "6bb07181-c62b-4440-9ab1-b41f968c8a05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.142651 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb07181-c62b-4440-9ab1-b41f968c8a05-kube-api-access-75h65" (OuterVolumeSpecName: "kube-api-access-75h65") pod "6bb07181-c62b-4440-9ab1-b41f968c8a05" (UID: "6bb07181-c62b-4440-9ab1-b41f968c8a05"). InnerVolumeSpecName "kube-api-access-75h65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.211501 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bb07181-c62b-4440-9ab1-b41f968c8a05" (UID: "6bb07181-c62b-4440-9ab1-b41f968c8a05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.232391 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.232438 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75h65\" (UniqueName: \"kubernetes.io/projected/6bb07181-c62b-4440-9ab1-b41f968c8a05-kube-api-access-75h65\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.232450 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bb07181-c62b-4440-9ab1-b41f968c8a05-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.271562 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sphw"]
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.271987 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7sphw" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="registry-server" containerID="cri-o://8452d2b9d5e8c0bd147f2f1d86aa10baceb555116e0f809d81cbc1beac94edf2" gracePeriod=2
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.406801 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dvk6" event={"ID":"6bb07181-c62b-4440-9ab1-b41f968c8a05","Type":"ContainerDied","Data":"6ea55ee2bb1c8724011da425e5c6a8f49fe4b2bb2f8872d24e5006b4bbf27052"}
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.406909 4676 scope.go:117] "RemoveContainer" containerID="b45196f199a258d0e0a414b6a3321aee06a7eb8dc13102b90116b0111eef439e"
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.406935 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dvk6"
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.422992 4676 scope.go:117] "RemoveContainer" containerID="e0ae77608f2f926e6e4fe78a61fb434753ce4dd52625563ab766cd891f532996"
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.435663 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dvk6"]
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.453185 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7dvk6"]
Sep 30 14:01:51 crc kubenswrapper[4676]: I0930 14:01:51.463391 4676 scope.go:117] "RemoveContainer" containerID="ea4e1dae03ca9464b2ec8d14207dc1271d626e142019a7310590b23a9c23e199"
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.417003 4676 generic.go:334] "Generic (PLEG): container finished" podID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerID="8452d2b9d5e8c0bd147f2f1d86aa10baceb555116e0f809d81cbc1beac94edf2" exitCode=0
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.417104 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sphw" event={"ID":"ddc55b99-87b9-4f3c-a5a5-08b857e41975","Type":"ContainerDied","Data":"8452d2b9d5e8c0bd147f2f1d86aa10baceb555116e0f809d81cbc1beac94edf2"}
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.751961 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sphw"
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.760270 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-catalog-content\") pod \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") "
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.781309 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddc55b99-87b9-4f3c-a5a5-08b857e41975" (UID: "ddc55b99-87b9-4f3c-a5a5-08b857e41975"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.862095 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-utilities\") pod \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") "
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.862172 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4w9l\" (UniqueName: \"kubernetes.io/projected/ddc55b99-87b9-4f3c-a5a5-08b857e41975-kube-api-access-s4w9l\") pod \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\" (UID: \"ddc55b99-87b9-4f3c-a5a5-08b857e41975\") "
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.862413 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.863087 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-utilities" (OuterVolumeSpecName: "utilities") pod "ddc55b99-87b9-4f3c-a5a5-08b857e41975" (UID: "ddc55b99-87b9-4f3c-a5a5-08b857e41975"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.869423 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc55b99-87b9-4f3c-a5a5-08b857e41975-kube-api-access-s4w9l" (OuterVolumeSpecName: "kube-api-access-s4w9l") pod "ddc55b99-87b9-4f3c-a5a5-08b857e41975" (UID: "ddc55b99-87b9-4f3c-a5a5-08b857e41975"). InnerVolumeSpecName "kube-api-access-s4w9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.963965 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc55b99-87b9-4f3c-a5a5-08b857e41975-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:52 crc kubenswrapper[4676]: I0930 14:01:52.964015 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4w9l\" (UniqueName: \"kubernetes.io/projected/ddc55b99-87b9-4f3c-a5a5-08b857e41975-kube-api-access-s4w9l\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.427036 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sphw" event={"ID":"ddc55b99-87b9-4f3c-a5a5-08b857e41975","Type":"ContainerDied","Data":"9493c61515478d51b931a0656af6c818d8fae88c8445b37c3d1f05dec2ff11bf"}
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.427083 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sphw"
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.427446 4676 scope.go:117] "RemoveContainer" containerID="8452d2b9d5e8c0bd147f2f1d86aa10baceb555116e0f809d81cbc1beac94edf2"
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.444491 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" path="/var/lib/kubelet/pods/6bb07181-c62b-4440-9ab1-b41f968c8a05/volumes"
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.466421 4676 scope.go:117] "RemoveContainer" containerID="8b8b4287b4df976d9914490bf4cbc31aadd58541826b75c8bdb71f8f5603959c"
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.467622 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sphw"]
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.479146 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sphw"]
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.488486 4676 scope.go:117] "RemoveContainer" containerID="94b5969bfcadc1cbb9f6022cde7f1c6fd569ff610dae2ec00833a0d62dfa37d6"
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.667249 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ndlhq"]
Sep 30 14:01:53 crc kubenswrapper[4676]: I0930 14:01:53.668007 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ndlhq" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="registry-server" containerID="cri-o://ff80039c2ecff735693d89b945a7d9e16aa2e50037b6979d5055174b244d1e2a" gracePeriod=2
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.448312 4676 generic.go:334] "Generic (PLEG): container finished" podID="98511dcf-3151-4955-b80b-c744ff06da2c" containerID="ff80039c2ecff735693d89b945a7d9e16aa2e50037b6979d5055174b244d1e2a" exitCode=0
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.448373 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndlhq" event={"ID":"98511dcf-3151-4955-b80b-c744ff06da2c","Type":"ContainerDied","Data":"ff80039c2ecff735693d89b945a7d9e16aa2e50037b6979d5055174b244d1e2a"}
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.550039 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndlhq"
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.692010 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkmf2\" (UniqueName: \"kubernetes.io/projected/98511dcf-3151-4955-b80b-c744ff06da2c-kube-api-access-gkmf2\") pod \"98511dcf-3151-4955-b80b-c744ff06da2c\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") "
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.693188 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-utilities\") pod \"98511dcf-3151-4955-b80b-c744ff06da2c\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") "
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.693400 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-catalog-content\") pod \"98511dcf-3151-4955-b80b-c744ff06da2c\" (UID: \"98511dcf-3151-4955-b80b-c744ff06da2c\") "
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.694628 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-utilities" (OuterVolumeSpecName: "utilities") pod "98511dcf-3151-4955-b80b-c744ff06da2c" (UID: "98511dcf-3151-4955-b80b-c744ff06da2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.700597 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98511dcf-3151-4955-b80b-c744ff06da2c-kube-api-access-gkmf2" (OuterVolumeSpecName: "kube-api-access-gkmf2") pod "98511dcf-3151-4955-b80b-c744ff06da2c" (UID: "98511dcf-3151-4955-b80b-c744ff06da2c"). InnerVolumeSpecName "kube-api-access-gkmf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.789216 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98511dcf-3151-4955-b80b-c744ff06da2c" (UID: "98511dcf-3151-4955-b80b-c744ff06da2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.796165 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkmf2\" (UniqueName: \"kubernetes.io/projected/98511dcf-3151-4955-b80b-c744ff06da2c-kube-api-access-gkmf2\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.796212 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:54 crc kubenswrapper[4676]: I0930 14:01:54.796223 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98511dcf-3151-4955-b80b-c744ff06da2c-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.441902 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" path="/var/lib/kubelet/pods/ddc55b99-87b9-4f3c-a5a5-08b857e41975/volumes"
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.459190 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndlhq" event={"ID":"98511dcf-3151-4955-b80b-c744ff06da2c","Type":"ContainerDied","Data":"efdb624ca6188c86d10e2410546f4b386814050869b846e0c2e1303427eae893"}
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.459247 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndlhq"
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.459251 4676 scope.go:117] "RemoveContainer" containerID="ff80039c2ecff735693d89b945a7d9e16aa2e50037b6979d5055174b244d1e2a"
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.479966 4676 scope.go:117] "RemoveContainer" containerID="5fecd8a9c902f0b022748890b4b1ff08f5532b2fd09d0cb33c378b09bdd5edf1"
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.491277 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ndlhq"]
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.497180 4676 scope.go:117] "RemoveContainer" containerID="d5535043d281d56183c250243cfa6e6f7947d910b74e472d1d1e2b9224f38761"
Sep 30 14:01:55 crc kubenswrapper[4676]: I0930 14:01:55.501609 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ndlhq"]
Sep 30 14:01:56 crc kubenswrapper[4676]: I0930 14:01:56.002416 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w8xnq"
Sep 30 14:01:56 crc kubenswrapper[4676]: I0930 14:01:56.399050 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4z4mt"
Sep 30 14:01:57 crc kubenswrapper[4676]: I0930 14:01:57.444955 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" path="/var/lib/kubelet/pods/98511dcf-3151-4955-b80b-c744ff06da2c/volumes"
Sep 30 14:01:58 crc kubenswrapper[4676]: I0930 14:01:58.668704 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4z4mt"]
Sep 30 14:01:58 crc kubenswrapper[4676]: I0930 14:01:58.670065 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4z4mt" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="registry-server" containerID="cri-o://8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236" gracePeriod=2
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.042346 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z4mt"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.153786 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-catalog-content\") pod \"a878b8c0-a59b-4cc7-ac47-18164d4337af\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") "
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.153863 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-utilities\") pod \"a878b8c0-a59b-4cc7-ac47-18164d4337af\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") "
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.153946 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwd8p\" (UniqueName: \"kubernetes.io/projected/a878b8c0-a59b-4cc7-ac47-18164d4337af-kube-api-access-qwd8p\") pod \"a878b8c0-a59b-4cc7-ac47-18164d4337af\" (UID: \"a878b8c0-a59b-4cc7-ac47-18164d4337af\") "
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.156020 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-utilities" (OuterVolumeSpecName: "utilities") pod "a878b8c0-a59b-4cc7-ac47-18164d4337af" (UID: "a878b8c0-a59b-4cc7-ac47-18164d4337af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.160617 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a878b8c0-a59b-4cc7-ac47-18164d4337af-kube-api-access-qwd8p" (OuterVolumeSpecName: "kube-api-access-qwd8p") pod "a878b8c0-a59b-4cc7-ac47-18164d4337af" (UID: "a878b8c0-a59b-4cc7-ac47-18164d4337af"). InnerVolumeSpecName "kube-api-access-qwd8p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.222159 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a878b8c0-a59b-4cc7-ac47-18164d4337af" (UID: "a878b8c0-a59b-4cc7-ac47-18164d4337af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.256108 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwd8p\" (UniqueName: \"kubernetes.io/projected/a878b8c0-a59b-4cc7-ac47-18164d4337af-kube-api-access-qwd8p\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.256149 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.256164 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a878b8c0-a59b-4cc7-ac47-18164d4337af-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.484587 4676 generic.go:334] "Generic (PLEG): container finished" podID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerID="8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236" exitCode=0
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.484649 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z4mt" event={"ID":"a878b8c0-a59b-4cc7-ac47-18164d4337af","Type":"ContainerDied","Data":"8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236"}
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.485041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4z4mt" event={"ID":"a878b8c0-a59b-4cc7-ac47-18164d4337af","Type":"ContainerDied","Data":"c11e4b5f46f2df7c3861e0c463f3ea35f366b21f1c89538e095ce7de1549d386"}
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.484689 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4z4mt"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.485070 4676 scope.go:117] "RemoveContainer" containerID="8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.503864 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4z4mt"]
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.507448 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4z4mt"]
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.515372 4676 scope.go:117] "RemoveContainer" containerID="e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.530969 4676 scope.go:117] "RemoveContainer" containerID="b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.551522 4676 scope.go:117] "RemoveContainer" containerID="8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236"
Sep 30 14:01:59 crc kubenswrapper[4676]: E0930 14:01:59.552065 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236\": container with ID starting with 8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236 not found: ID does not exist" containerID="8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.552108 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236"} err="failed to get container status \"8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236\": rpc error: code = NotFound desc = could not find container \"8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236\": container with ID starting with 8eb7e6dd615cd38bc6804da4a4a165e5d8b6b77a71591fdbb8f343b38630b236 not found: ID does not exist"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.552159 4676 scope.go:117] "RemoveContainer" containerID="e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca"
Sep 30 14:01:59 crc kubenswrapper[4676]: E0930 14:01:59.552691 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca\": container with ID starting with e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca not found: ID does not exist" containerID="e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.552718 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca"} err="failed to get container status \"e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca\": rpc error: code = NotFound desc = could not find container \"e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca\": container with ID starting with e51e99d99c81798227bd197549ee6ebe102823552d6850978e9da7983d8366ca not found: ID does not exist"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.552735 4676 scope.go:117] "RemoveContainer" containerID="b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99"
Sep 30 14:01:59 crc kubenswrapper[4676]: E0930 14:01:59.553015 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99\": container with ID starting with b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99 not found: ID does not exist" containerID="b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.553037 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99"} err="failed to get container status \"b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99\": rpc error: code = NotFound desc = could not find container \"b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99\": container with ID starting with b09edd7b0dc416e77fc7379f5bfcf28d6b3ff9ff2f74b3e346b3233a427bab99 not found: ID does not exist"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.919542 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.919637 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.919724 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.920508 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 14:01:59 crc kubenswrapper[4676]: I0930 14:01:59.920589 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b" gracePeriod=600
Sep 30 14:02:00 crc kubenswrapper[4676]: I0930 14:02:00.493039 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b" exitCode=0
Sep 30 14:02:00 crc kubenswrapper[4676]: I0930 14:02:00.493137 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b"}
Sep 30 14:02:00 crc kubenswrapper[4676]: I0930 14:02:00.493466 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"1be5dc03e37879b7c4a0ccbe08345138fa5cba0d902530b2119831d575d4b86d"}
Sep 30 14:02:01 crc kubenswrapper[4676]: I0930 14:02:01.441310 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" path="/var/lib/kubelet/pods/a878b8c0-a59b-4cc7-ac47-18164d4337af/volumes"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.552353 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8bmsn"]
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.553265 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8bmsn" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="registry-server" containerID="cri-o://02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060" gracePeriod=30
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.560058 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8xnq"]
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.560301 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w8xnq" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="registry-server" containerID="cri-o://c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d" gracePeriod=30
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.577271 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wrkd"]
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.577500 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" podUID="4ec8c2c0-6649-4fce-bd49-c178b25d9da1" containerName="marketplace-operator" containerID="cri-o://2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d" gracePeriod=30
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.580385 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6gvw"]
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.580687 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t6gvw" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="registry-server" containerID="cri-o://e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea" gracePeriod=30
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594400 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vnxs8"]
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594716 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594738 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594750 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594758 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594769 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594777 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594789 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594797 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594810 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594817 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594845 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c8cc1a-bfb8-4999-9a79-be01ae514a3c" containerName="pruner"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594855 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c8cc1a-bfb8-4999-9a79-be01ae514a3c" containerName="pruner"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594865 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594873 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594898 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594905 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594917 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6468238-1faf-4dc1-82f2-1d2ade02bbeb" containerName="pruner"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594925 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6468238-1faf-4dc1-82f2-1d2ade02bbeb" containerName="pruner"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594937 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594944 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594953 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594961 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594977 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.594984 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="extract-content"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.594997 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595004 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="extract-utilities"
Sep 30 14:02:30 crc kubenswrapper[4676]: E0930 14:02:30.595014 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595022 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595138 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc55b99-87b9-4f3c-a5a5-08b857e41975" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595155 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb07181-c62b-4440-9ab1-b41f968c8a05" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595168 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="98511dcf-3151-4955-b80b-c744ff06da2c" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595178 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c8cc1a-bfb8-4999-9a79-be01ae514a3c" containerName="pruner"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595188 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6468238-1faf-4dc1-82f2-1d2ade02bbeb" containerName="pruner"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595200 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a878b8c0-a59b-4cc7-ac47-18164d4337af" containerName="registry-server"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.595757 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8"
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.598331 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5q6qs"]
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.598605 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5q6qs" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="registry-server" containerID="cri-o://441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f" gracePeriod=30
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.602855 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vnxs8"]
Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.792993 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\"
(UniqueName: \"kubernetes.io/secret/fbc2525f-9c5f-4639-908d-35fed61607f5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.793047 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbc2525f-9c5f-4639-908d-35fed61607f5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.793107 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9zwr\" (UniqueName: \"kubernetes.io/projected/fbc2525f-9c5f-4639-908d-35fed61607f5-kube-api-access-b9zwr\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.895620 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fbc2525f-9c5f-4639-908d-35fed61607f5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.895997 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbc2525f-9c5f-4639-908d-35fed61607f5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.896085 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9zwr\" (UniqueName: \"kubernetes.io/projected/fbc2525f-9c5f-4639-908d-35fed61607f5-kube-api-access-b9zwr\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.897198 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbc2525f-9c5f-4639-908d-35fed61607f5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.903767 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fbc2525f-9c5f-4639-908d-35fed61607f5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:30 crc kubenswrapper[4676]: I0930 14:02:30.915325 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9zwr\" (UniqueName: \"kubernetes.io/projected/fbc2525f-9c5f-4639-908d-35fed61607f5-kube-api-access-b9zwr\") pod \"marketplace-operator-79b997595-vnxs8\" (UID: \"fbc2525f-9c5f-4639-908d-35fed61607f5\") " pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.050385 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.054079 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.067133 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8xnq" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.069059 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.099779 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-catalog-content\") pod \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.099817 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-utilities\") pod \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.100731 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-utilities" (OuterVolumeSpecName: "utilities") pod "9d686c86-1f90-4466-a2e9-19f8f15c7e87" (UID: "9d686c86-1f90-4466-a2e9-19f8f15c7e87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.100854 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6gvw" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.191609 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d686c86-1f90-4466-a2e9-19f8f15c7e87" (UID: "9d686c86-1f90-4466-a2e9-19f8f15c7e87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.204447 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-utilities\") pod \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.204769 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9g5q\" (UniqueName: \"kubernetes.io/projected/cad652ea-7de0-4021-b594-7ea2a0681286-kube-api-access-h9g5q\") pod \"cad652ea-7de0-4021-b594-7ea2a0681286\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.206588 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68kh\" (UniqueName: \"kubernetes.io/projected/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-kube-api-access-v68kh\") pod \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.206758 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-operator-metrics\") pod \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " Sep 30 14:02:31 
crc kubenswrapper[4676]: I0930 14:02:31.206866 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-utilities\") pod \"cad652ea-7de0-4021-b594-7ea2a0681286\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.206970 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-catalog-content\") pod \"586d40ff-8404-4b50-bfab-bd99ba97daca\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207039 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvr9w\" (UniqueName: \"kubernetes.io/projected/9d686c86-1f90-4466-a2e9-19f8f15c7e87-kube-api-access-tvr9w\") pod \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\" (UID: \"9d686c86-1f90-4466-a2e9-19f8f15c7e87\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207110 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-trusted-ca\") pod \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\" (UID: \"4ec8c2c0-6649-4fce-bd49-c178b25d9da1\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207194 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-catalog-content\") pod \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207277 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hkbr\" (UniqueName: 
\"kubernetes.io/projected/586d40ff-8404-4b50-bfab-bd99ba97daca-kube-api-access-7hkbr\") pod \"586d40ff-8404-4b50-bfab-bd99ba97daca\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207354 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbfbm\" (UniqueName: \"kubernetes.io/projected/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-kube-api-access-wbfbm\") pod \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\" (UID: \"19e521ab-fa8b-43c8-a6ca-c57b507fce2d\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207420 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-catalog-content\") pod \"cad652ea-7de0-4021-b594-7ea2a0681286\" (UID: \"cad652ea-7de0-4021-b594-7ea2a0681286\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207519 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-utilities\") pod \"586d40ff-8404-4b50-bfab-bd99ba97daca\" (UID: \"586d40ff-8404-4b50-bfab-bd99ba97daca\") " Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.205298 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-utilities" (OuterVolumeSpecName: "utilities") pod "19e521ab-fa8b-43c8-a6ca-c57b507fce2d" (UID: "19e521ab-fa8b-43c8-a6ca-c57b507fce2d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.207767 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-utilities" (OuterVolumeSpecName: "utilities") pod "cad652ea-7de0-4021-b594-7ea2a0681286" (UID: "cad652ea-7de0-4021-b594-7ea2a0681286"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.208849 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4ec8c2c0-6649-4fce-bd49-c178b25d9da1" (UID: "4ec8c2c0-6649-4fce-bd49-c178b25d9da1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.209287 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-utilities" (OuterVolumeSpecName: "utilities") pod "586d40ff-8404-4b50-bfab-bd99ba97daca" (UID: "586d40ff-8404-4b50-bfab-bd99ba97daca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210149 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210409 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210506 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210708 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210798 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210860 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d686c86-1f90-4466-a2e9-19f8f15c7e87-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210205 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad652ea-7de0-4021-b594-7ea2a0681286-kube-api-access-h9g5q" (OuterVolumeSpecName: "kube-api-access-h9g5q") pod "cad652ea-7de0-4021-b594-7ea2a0681286" (UID: 
"cad652ea-7de0-4021-b594-7ea2a0681286"). InnerVolumeSpecName "kube-api-access-h9g5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.210644 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-kube-api-access-wbfbm" (OuterVolumeSpecName: "kube-api-access-wbfbm") pod "19e521ab-fa8b-43c8-a6ca-c57b507fce2d" (UID: "19e521ab-fa8b-43c8-a6ca-c57b507fce2d"). InnerVolumeSpecName "kube-api-access-wbfbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.212012 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586d40ff-8404-4b50-bfab-bd99ba97daca-kube-api-access-7hkbr" (OuterVolumeSpecName: "kube-api-access-7hkbr") pod "586d40ff-8404-4b50-bfab-bd99ba97daca" (UID: "586d40ff-8404-4b50-bfab-bd99ba97daca"). InnerVolumeSpecName "kube-api-access-7hkbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.212058 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4ec8c2c0-6649-4fce-bd49-c178b25d9da1" (UID: "4ec8c2c0-6649-4fce-bd49-c178b25d9da1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.213142 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.216171 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d686c86-1f90-4466-a2e9-19f8f15c7e87-kube-api-access-tvr9w" (OuterVolumeSpecName: "kube-api-access-tvr9w") pod "9d686c86-1f90-4466-a2e9-19f8f15c7e87" (UID: "9d686c86-1f90-4466-a2e9-19f8f15c7e87"). InnerVolumeSpecName "kube-api-access-tvr9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.221599 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-kube-api-access-v68kh" (OuterVolumeSpecName: "kube-api-access-v68kh") pod "4ec8c2c0-6649-4fce-bd49-c178b25d9da1" (UID: "4ec8c2c0-6649-4fce-bd49-c178b25d9da1"). InnerVolumeSpecName "kube-api-access-v68kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.230303 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "586d40ff-8404-4b50-bfab-bd99ba97daca" (UID: "586d40ff-8404-4b50-bfab-bd99ba97daca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.270932 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cad652ea-7de0-4021-b594-7ea2a0681286" (UID: "cad652ea-7de0-4021-b594-7ea2a0681286"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.280353 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19e521ab-fa8b-43c8-a6ca-c57b507fce2d" (UID: "19e521ab-fa8b-43c8-a6ca-c57b507fce2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311759 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9g5q\" (UniqueName: \"kubernetes.io/projected/cad652ea-7de0-4021-b594-7ea2a0681286-kube-api-access-h9g5q\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311790 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v68kh\" (UniqueName: \"kubernetes.io/projected/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-kube-api-access-v68kh\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311800 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4ec8c2c0-6649-4fce-bd49-c178b25d9da1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311810 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/586d40ff-8404-4b50-bfab-bd99ba97daca-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311819 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvr9w\" (UniqueName: \"kubernetes.io/projected/9d686c86-1f90-4466-a2e9-19f8f15c7e87-kube-api-access-tvr9w\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311828 4676 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311839 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hkbr\" (UniqueName: \"kubernetes.io/projected/586d40ff-8404-4b50-bfab-bd99ba97daca-kube-api-access-7hkbr\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311848 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbfbm\" (UniqueName: \"kubernetes.io/projected/19e521ab-fa8b-43c8-a6ca-c57b507fce2d-kube-api-access-wbfbm\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.311856 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cad652ea-7de0-4021-b594-7ea2a0681286-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.615942 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vnxs8"] Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.661806 4676 generic.go:334] "Generic (PLEG): container finished" podID="4ec8c2c0-6649-4fce-bd49-c178b25d9da1" containerID="2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d" exitCode=0 Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.661866 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" event={"ID":"4ec8c2c0-6649-4fce-bd49-c178b25d9da1","Type":"ContainerDied","Data":"2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.661942 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.661982 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7wrkd" event={"ID":"4ec8c2c0-6649-4fce-bd49-c178b25d9da1","Type":"ContainerDied","Data":"bbd3ed4b1da8ec0fd64440ede714280bced06fa4169a8b6684fa9cfb0afe8791"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.662017 4676 scope.go:117] "RemoveContainer" containerID="2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.664490 4676 generic.go:334] "Generic (PLEG): container finished" podID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerID="02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060" exitCode=0 Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.664551 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bmsn" event={"ID":"19e521ab-fa8b-43c8-a6ca-c57b507fce2d","Type":"ContainerDied","Data":"02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.664610 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bmsn" event={"ID":"19e521ab-fa8b-43c8-a6ca-c57b507fce2d","Type":"ContainerDied","Data":"2d7f69527f71ccc2d1781dc6a8c454f62e75d4b1f192fadb55e90dad1721bcf3"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.664681 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bmsn" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.667955 4676 generic.go:334] "Generic (PLEG): container finished" podID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerID="441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f" exitCode=0 Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.667999 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q6qs" event={"ID":"9d686c86-1f90-4466-a2e9-19f8f15c7e87","Type":"ContainerDied","Data":"441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.668018 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5q6qs" event={"ID":"9d686c86-1f90-4466-a2e9-19f8f15c7e87","Type":"ContainerDied","Data":"4ba66d080e0a95bf6280103cd79fbc8fb5f58534a93c729be94b3e5d126fb92e"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.668071 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5q6qs" Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.670175 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" event={"ID":"fbc2525f-9c5f-4639-908d-35fed61607f5","Type":"ContainerStarted","Data":"a0622ba012eaa14c5d544b98ac5e32c48afc6aa6d47406191df4acf0ed2fcf99"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.682427 4676 generic.go:334] "Generic (PLEG): container finished" podID="cad652ea-7de0-4021-b594-7ea2a0681286" containerID="c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d" exitCode=0 Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.682580 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8xnq" event={"ID":"cad652ea-7de0-4021-b594-7ea2a0681286","Type":"ContainerDied","Data":"c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.682629 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8xnq" event={"ID":"cad652ea-7de0-4021-b594-7ea2a0681286","Type":"ContainerDied","Data":"e777f253c3e60f7a18141d1913731826cbf10f1e5c498e36e2934c061a8873b6"} Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.682765 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w8xnq"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.687074 4676 generic.go:334] "Generic (PLEG): container finished" podID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerID="e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea" exitCode=0
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.687147 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6gvw" event={"ID":"586d40ff-8404-4b50-bfab-bd99ba97daca","Type":"ContainerDied","Data":"e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea"}
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.687189 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6gvw" event={"ID":"586d40ff-8404-4b50-bfab-bd99ba97daca","Type":"ContainerDied","Data":"116cd9a9aadf880d483ebe7c4bb18311a2798147600f0200765b6e94fa3f9621"}
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.687318 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6gvw"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.735340 4676 scope.go:117] "RemoveContainer" containerID="2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.735790 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d\": container with ID starting with 2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d not found: ID does not exist" containerID="2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.735826 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d"} err="failed to get container status \"2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d\": rpc error: code = NotFound desc = could not find container \"2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d\": container with ID starting with 2556a81d50a43dce8ab2ef1af4e228ec7834ac896071d3944068268a66d3370d not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.735852 4676 scope.go:117] "RemoveContainer" containerID="02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.755186 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8bmsn"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.757534 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8bmsn"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.764980 4676 scope.go:117] "RemoveContainer" containerID="6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.775331 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6gvw"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.777944 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6gvw"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.783028 4676 scope.go:117] "RemoveContainer" containerID="bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.787688 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wrkd"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.790558 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wrkd"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.800925 4676 scope.go:117] "RemoveContainer" containerID="02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.803451 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060\": container with ID starting with 02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060 not found: ID does not exist" containerID="02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.803493 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060"} err="failed to get container status \"02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060\": rpc error: code = NotFound desc = could not find container \"02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060\": container with ID starting with 02bbce9deeb0e176c9fbc29ffe090d8aa25de3ddbab110a96fc3b86c5fd6f060 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.803520 4676 scope.go:117] "RemoveContainer" containerID="6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.805293 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420\": container with ID starting with 6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420 not found: ID does not exist" containerID="6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.805337 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420"} err="failed to get container status \"6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420\": rpc error: code = NotFound desc = could not find container \"6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420\": container with ID starting with 6b790c37c7a855220e503a0220c33cfa909afe18a7d0817c6e52e44bf1528420 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.805355 4676 scope.go:117] "RemoveContainer" containerID="bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.805847 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237\": container with ID starting with bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237 not found: ID does not exist" containerID="bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.805920 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237"} err="failed to get container status \"bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237\": rpc error: code = NotFound desc = could not find container \"bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237\": container with ID starting with bb4a9e0b3ab2ff339c01cf98a8d22afcaf60713b61bc17a6789fa3a15ae8a237 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.805964 4676 scope.go:117] "RemoveContainer" containerID="441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.807008 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5q6qs"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.813920 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5q6qs"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.821243 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8xnq"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.823752 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w8xnq"]
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.826707 4676 scope.go:117] "RemoveContainer" containerID="5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.841627 4676 scope.go:117] "RemoveContainer" containerID="e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.857214 4676 scope.go:117] "RemoveContainer" containerID="441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.857837 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f\": container with ID starting with 441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f not found: ID does not exist" containerID="441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.857972 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f"} err="failed to get container status \"441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f\": rpc error: code = NotFound desc = could not find container \"441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f\": container with ID starting with 441d673a4747051b8c52cda9c97b09654d585d08777d32ddbf561ff0ad43736f not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.858006 4676 scope.go:117] "RemoveContainer" containerID="5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.858363 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90\": container with ID starting with 5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90 not found: ID does not exist" containerID="5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.858384 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90"} err="failed to get container status \"5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90\": rpc error: code = NotFound desc = could not find container \"5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90\": container with ID starting with 5e112d1e0e150545da3ff5aaa353c0bf4e2b223ca78bb4307ddeb478b281fd90 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.858424 4676 scope.go:117] "RemoveContainer" containerID="e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.858658 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1\": container with ID starting with e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1 not found: ID does not exist" containerID="e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.858681 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1"} err="failed to get container status \"e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1\": rpc error: code = NotFound desc = could not find container \"e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1\": container with ID starting with e0790c752393455f9fb4c8f94625283230db41e64814ea96115862574c253ae1 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.858695 4676 scope.go:117] "RemoveContainer" containerID="c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.872787 4676 scope.go:117] "RemoveContainer" containerID="e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.893374 4676 scope.go:117] "RemoveContainer" containerID="67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.911494 4676 scope.go:117] "RemoveContainer" containerID="c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.912015 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d\": container with ID starting with c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d not found: ID does not exist" containerID="c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.912050 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d"} err="failed to get container status \"c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d\": rpc error: code = NotFound desc = could not find container \"c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d\": container with ID starting with c9571b48b72cab3363f8243f6b38e063a59ceeb2c4a668a207ce79fb5489565d not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.912075 4676 scope.go:117] "RemoveContainer" containerID="e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.912770 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105\": container with ID starting with e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105 not found: ID does not exist" containerID="e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.912794 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105"} err="failed to get container status \"e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105\": rpc error: code = NotFound desc = could not find container \"e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105\": container with ID starting with e00545ca5dce6946a03be66ddfead10b993270c41477e741d1a34ca9fc940105 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.912807 4676 scope.go:117] "RemoveContainer" containerID="67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.913119 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27\": container with ID starting with 67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27 not found: ID does not exist" containerID="67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.913160 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27"} err="failed to get container status \"67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27\": rpc error: code = NotFound desc = could not find container \"67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27\": container with ID starting with 67d6562e91b061be4c580510460c491552061166877e34c5e0c845be22e9cf27 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.913178 4676 scope.go:117] "RemoveContainer" containerID="e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.932296 4676 scope.go:117] "RemoveContainer" containerID="19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.961451 4676 scope.go:117] "RemoveContainer" containerID="c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.974043 4676 scope.go:117] "RemoveContainer" containerID="e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.974631 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea\": container with ID starting with e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea not found: ID does not exist" containerID="e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.974665 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea"} err="failed to get container status \"e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea\": rpc error: code = NotFound desc = could not find container \"e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea\": container with ID starting with e20d6443891fb09bc00d4c3346a31ea6e239a36ec7c6710035347fb575d56dea not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.974690 4676 scope.go:117] "RemoveContainer" containerID="19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.975112 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2\": container with ID starting with 19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2 not found: ID does not exist" containerID="19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.975135 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2"} err="failed to get container status \"19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2\": rpc error: code = NotFound desc = could not find container \"19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2\": container with ID starting with 19f1b2f8867d8e868c93e8fefdb2f397d937851c15e0e7ff5b66da0f901491d2 not found: ID does not exist"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.975150 4676 scope.go:117] "RemoveContainer" containerID="c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe"
Sep 30 14:02:31 crc kubenswrapper[4676]: E0930 14:02:31.975729 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe\": container with ID starting with c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe not found: ID does not exist" containerID="c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe"
Sep 30 14:02:31 crc kubenswrapper[4676]: I0930 14:02:31.975755 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe"} err="failed to get container status \"c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe\": rpc error: code = NotFound desc = could not find container \"c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe\": container with ID starting with c03fe5212b408772d7c7718143d27cea6e1a71fad7488f0cf38cc07d692947fe not found: ID does not exist"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.451508 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8mk5"]
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.697983 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" event={"ID":"fbc2525f-9c5f-4639-908d-35fed61607f5","Type":"ContainerStarted","Data":"f972b006c266a7b197f6243d9e5c3395c4adb0e68b21dbd271325a7018d6ad10"}
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.698903 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.703009 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.714664 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vnxs8" podStartSLOduration=2.714645562 podStartE2EDuration="2.714645562s" podCreationTimestamp="2025-09-30 14:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:02:32.713101689 +0000 UTC m=+256.696190128" watchObservedRunningTime="2025-09-30 14:02:32.714645562 +0000 UTC m=+256.697733991"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776392 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pw8ff"]
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776631 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776648 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776660 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776668 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776681 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776689 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776697 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776705 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776720 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776729 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776743 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776751 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776765 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec8c2c0-6649-4fce-bd49-c178b25d9da1" containerName="marketplace-operator"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776773 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec8c2c0-6649-4fce-bd49-c178b25d9da1" containerName="marketplace-operator"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776783 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776791 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776802 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776809 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="extract-utilities"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776821 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776829 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776840 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776848 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776858 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776865 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="extract-content"
Sep 30 14:02:32 crc kubenswrapper[4676]: E0930 14:02:32.776872 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.776919 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.777034 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.777052 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.777068 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec8c2c0-6649-4fce-bd49-c178b25d9da1" containerName="marketplace-operator"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.777076 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.777087 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" containerName="registry-server"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.777899 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.781369 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.793333 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw8ff"]
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.831735 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbm67\" (UniqueName: \"kubernetes.io/projected/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-kube-api-access-pbm67\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.831785 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-utilities\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.831825 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-catalog-content\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.932986 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-catalog-content\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.933075 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbm67\" (UniqueName: \"kubernetes.io/projected/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-kube-api-access-pbm67\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.933092 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-utilities\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.933496 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-catalog-content\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.933499 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-utilities\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.952026 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbm67\" (UniqueName: \"kubernetes.io/projected/6ffc9090-e9c1-4f42-b73c-1ec86c49e317-kube-api-access-pbm67\") pod \"redhat-marketplace-pw8ff\" (UID: \"6ffc9090-e9c1-4f42-b73c-1ec86c49e317\") " pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.975136 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l9lpt"]
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.976761 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.979373 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Sep 30 14:02:32 crc kubenswrapper[4676]: I0930 14:02:32.983705 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9lpt"]
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.033800 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-catalog-content\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.033895 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsnp4\" (UniqueName: \"kubernetes.io/projected/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-kube-api-access-gsnp4\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.033950 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-utilities\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.097258 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw8ff"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.135075 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsnp4\" (UniqueName: \"kubernetes.io/projected/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-kube-api-access-gsnp4\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.135162 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-utilities\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.135198 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-catalog-content\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.135697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-catalog-content\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.135794 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-utilities\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.153254 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsnp4\" (UniqueName: \"kubernetes.io/projected/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-kube-api-access-gsnp4\") pod \"redhat-operators-l9lpt\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.307522 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l9lpt"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.446015 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e521ab-fa8b-43c8-a6ca-c57b507fce2d" path="/var/lib/kubelet/pods/19e521ab-fa8b-43c8-a6ca-c57b507fce2d/volumes"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.446821 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec8c2c0-6649-4fce-bd49-c178b25d9da1" path="/var/lib/kubelet/pods/4ec8c2c0-6649-4fce-bd49-c178b25d9da1/volumes"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.447258 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586d40ff-8404-4b50-bfab-bd99ba97daca" path="/var/lib/kubelet/pods/586d40ff-8404-4b50-bfab-bd99ba97daca/volumes"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.448334 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d686c86-1f90-4466-a2e9-19f8f15c7e87" path="/var/lib/kubelet/pods/9d686c86-1f90-4466-a2e9-19f8f15c7e87/volumes"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.449034 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad652ea-7de0-4021-b594-7ea2a0681286" path="/var/lib/kubelet/pods/cad652ea-7de0-4021-b594-7ea2a0681286/volumes"
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.540374 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw8ff"]
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.675492 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l9lpt"]
Sep 30 14:02:33 crc kubenswrapper[4676]: W0930 14:02:33.679643 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2edaf2e_4def_4177_87e6_6e9e1a62f16b.slice/crio-2809c953b8fe8193b35538cd811c4ddd9c97f5d39151b90bde5bf9a25821baca WatchSource:0}: Error finding container 2809c953b8fe8193b35538cd811c4ddd9c97f5d39151b90bde5bf9a25821baca: Status 404 returned error can't find the container with id 2809c953b8fe8193b35538cd811c4ddd9c97f5d39151b90bde5bf9a25821baca
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.717055 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9lpt" event={"ID":"c2edaf2e-4def-4177-87e6-6e9e1a62f16b","Type":"ContainerStarted","Data":"2809c953b8fe8193b35538cd811c4ddd9c97f5d39151b90bde5bf9a25821baca"}
Sep 30 14:02:33 crc kubenswrapper[4676]: I0930 14:02:33.721043 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw8ff" event={"ID":"6ffc9090-e9c1-4f42-b73c-1ec86c49e317","Type":"ContainerStarted","Data":"23e42d232df5c46300e80e41b187bc5a622f23f6279101bb591aaeed72455bd6"}
Sep 30 14:02:34 crc kubenswrapper[4676]: I0930 14:02:34.730714 4676 generic.go:334] "Generic (PLEG): container finished"
podID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerID="cf235d82ff40a1de080292475ad9c354382c1b82a530b8fa88f2f0e58edd0ae1" exitCode=0 Sep 30 14:02:34 crc kubenswrapper[4676]: I0930 14:02:34.730816 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9lpt" event={"ID":"c2edaf2e-4def-4177-87e6-6e9e1a62f16b","Type":"ContainerDied","Data":"cf235d82ff40a1de080292475ad9c354382c1b82a530b8fa88f2f0e58edd0ae1"} Sep 30 14:02:34 crc kubenswrapper[4676]: I0930 14:02:34.732549 4676 generic.go:334] "Generic (PLEG): container finished" podID="6ffc9090-e9c1-4f42-b73c-1ec86c49e317" containerID="fcba23cc0d9d9920395bb13a4cad056821c7153e37e9dc8d6a799fe0228f3ee3" exitCode=0 Sep 30 14:02:34 crc kubenswrapper[4676]: I0930 14:02:34.733461 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw8ff" event={"ID":"6ffc9090-e9c1-4f42-b73c-1ec86c49e317","Type":"ContainerDied","Data":"fcba23cc0d9d9920395bb13a4cad056821c7153e37e9dc8d6a799fe0228f3ee3"} Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.177086 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qcljw"] Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.178818 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.181853 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.186417 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcljw"] Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.260525 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-catalog-content\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.261335 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-utilities\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.261542 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96npb\" (UniqueName: \"kubernetes.io/projected/6f1bd977-799f-432c-80c7-cec958829e37-kube-api-access-96npb\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.362947 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-utilities\") pod \"community-operators-qcljw\" (UID: 
\"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.363045 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96npb\" (UniqueName: \"kubernetes.io/projected/6f1bd977-799f-432c-80c7-cec958829e37-kube-api-access-96npb\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.363085 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-catalog-content\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.363568 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-catalog-content\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.363590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-utilities\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.375723 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qxnqk"] Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.376827 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.378751 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.391577 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxnqk"] Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.391586 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96npb\" (UniqueName: \"kubernetes.io/projected/6f1bd977-799f-432c-80c7-cec958829e37-kube-api-access-96npb\") pod \"community-operators-qcljw\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.464826 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-utilities\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.464943 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqbks\" (UniqueName: \"kubernetes.io/projected/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-kube-api-access-jqbks\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.465101 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-catalog-content\") pod \"certified-operators-qxnqk\" (UID: 
\"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.497491 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.567026 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqbks\" (UniqueName: \"kubernetes.io/projected/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-kube-api-access-jqbks\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.567871 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-catalog-content\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.568062 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-utilities\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.568441 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-catalog-content\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.568667 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-utilities\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.609121 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqbks\" (UniqueName: \"kubernetes.io/projected/6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1-kube-api-access-jqbks\") pod \"certified-operators-qxnqk\" (UID: \"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1\") " pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.719641 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcljw"] Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.719918 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:35 crc kubenswrapper[4676]: W0930 14:02:35.731958 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1bd977_799f_432c_80c7_cec958829e37.slice/crio-37f6b1a68e08a82b49398e4a8ca45984e737fb12acf0b8149c73890e082b56c9 WatchSource:0}: Error finding container 37f6b1a68e08a82b49398e4a8ca45984e737fb12acf0b8149c73890e082b56c9: Status 404 returned error can't find the container with id 37f6b1a68e08a82b49398e4a8ca45984e737fb12acf0b8149c73890e082b56c9 Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.744389 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcljw" event={"ID":"6f1bd977-799f-432c-80c7-cec958829e37","Type":"ContainerStarted","Data":"37f6b1a68e08a82b49398e4a8ca45984e737fb12acf0b8149c73890e082b56c9"} Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.748323 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-pw8ff" event={"ID":"6ffc9090-e9c1-4f42-b73c-1ec86c49e317","Type":"ContainerStarted","Data":"351d6e8712a8ef60ee9bf23cfa0b77ebed5d59f778f149eb99d059ebb08800dc"} Sep 30 14:02:35 crc kubenswrapper[4676]: I0930 14:02:35.758789 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9lpt" event={"ID":"c2edaf2e-4def-4177-87e6-6e9e1a62f16b","Type":"ContainerStarted","Data":"97823dcb6c5d319e48de696227056449e81b62d8716c2e63cb70278bda65d77f"} Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.159976 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxnqk"] Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.770222 4676 generic.go:334] "Generic (PLEG): container finished" podID="6ffc9090-e9c1-4f42-b73c-1ec86c49e317" containerID="351d6e8712a8ef60ee9bf23cfa0b77ebed5d59f778f149eb99d059ebb08800dc" exitCode=0 Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.770281 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw8ff" event={"ID":"6ffc9090-e9c1-4f42-b73c-1ec86c49e317","Type":"ContainerDied","Data":"351d6e8712a8ef60ee9bf23cfa0b77ebed5d59f778f149eb99d059ebb08800dc"} Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.778306 4676 generic.go:334] "Generic (PLEG): container finished" podID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerID="97823dcb6c5d319e48de696227056449e81b62d8716c2e63cb70278bda65d77f" exitCode=0 Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.778378 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9lpt" event={"ID":"c2edaf2e-4def-4177-87e6-6e9e1a62f16b","Type":"ContainerDied","Data":"97823dcb6c5d319e48de696227056449e81b62d8716c2e63cb70278bda65d77f"} Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.780797 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="6f1bd977-799f-432c-80c7-cec958829e37" containerID="de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5" exitCode=0 Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.780914 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcljw" event={"ID":"6f1bd977-799f-432c-80c7-cec958829e37","Type":"ContainerDied","Data":"de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5"} Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.785812 4676 generic.go:334] "Generic (PLEG): container finished" podID="6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1" containerID="7d859f983b8e3ce7fd198fa25bda72bfae8bd7fb7f96331cd94b50912a7fa6ce" exitCode=0 Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.785940 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxnqk" event={"ID":"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1","Type":"ContainerDied","Data":"7d859f983b8e3ce7fd198fa25bda72bfae8bd7fb7f96331cd94b50912a7fa6ce"} Sep 30 14:02:36 crc kubenswrapper[4676]: I0930 14:02:36.785981 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxnqk" event={"ID":"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1","Type":"ContainerStarted","Data":"68db5486f9302a7a11df36f7224b5f6c2fc9b8bfe4da2512784925b0e6cf3786"} Sep 30 14:02:37 crc kubenswrapper[4676]: I0930 14:02:37.793855 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw8ff" event={"ID":"6ffc9090-e9c1-4f42-b73c-1ec86c49e317","Type":"ContainerStarted","Data":"fcbeb53819748438df771b9cf0c956f61edc45fa3b41da192dbb189044fecd56"} Sep 30 14:02:37 crc kubenswrapper[4676]: I0930 14:02:37.796740 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9lpt" 
event={"ID":"c2edaf2e-4def-4177-87e6-6e9e1a62f16b","Type":"ContainerStarted","Data":"e0032e8cd9fe7d955fedefcfc265b528b16990ebcf898e7d0071de4fe5ca07e8"} Sep 30 14:02:37 crc kubenswrapper[4676]: I0930 14:02:37.815094 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pw8ff" podStartSLOduration=3.315237046 podStartE2EDuration="5.815074136s" podCreationTimestamp="2025-09-30 14:02:32 +0000 UTC" firstStartedPulling="2025-09-30 14:02:34.733894241 +0000 UTC m=+258.716982670" lastFinishedPulling="2025-09-30 14:02:37.233731331 +0000 UTC m=+261.216819760" observedRunningTime="2025-09-30 14:02:37.813291727 +0000 UTC m=+261.796380166" watchObservedRunningTime="2025-09-30 14:02:37.815074136 +0000 UTC m=+261.798162555" Sep 30 14:02:38 crc kubenswrapper[4676]: I0930 14:02:38.805840 4676 generic.go:334] "Generic (PLEG): container finished" podID="6f1bd977-799f-432c-80c7-cec958829e37" containerID="1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73" exitCode=0 Sep 30 14:02:38 crc kubenswrapper[4676]: I0930 14:02:38.805917 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcljw" event={"ID":"6f1bd977-799f-432c-80c7-cec958829e37","Type":"ContainerDied","Data":"1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73"} Sep 30 14:02:38 crc kubenswrapper[4676]: I0930 14:02:38.809234 4676 generic.go:334] "Generic (PLEG): container finished" podID="6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1" containerID="aa55c3d6bb5daa6e2b1af590a7ed93a79652da0593fe3f20449b8c922427d976" exitCode=0 Sep 30 14:02:38 crc kubenswrapper[4676]: I0930 14:02:38.809378 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxnqk" event={"ID":"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1","Type":"ContainerDied","Data":"aa55c3d6bb5daa6e2b1af590a7ed93a79652da0593fe3f20449b8c922427d976"} Sep 30 14:02:38 crc kubenswrapper[4676]: I0930 14:02:38.840802 
4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l9lpt" podStartSLOduration=4.431937077 podStartE2EDuration="6.840762387s" podCreationTimestamp="2025-09-30 14:02:32 +0000 UTC" firstStartedPulling="2025-09-30 14:02:34.732046579 +0000 UTC m=+258.715135018" lastFinishedPulling="2025-09-30 14:02:37.140871899 +0000 UTC m=+261.123960328" observedRunningTime="2025-09-30 14:02:37.839705571 +0000 UTC m=+261.822794020" watchObservedRunningTime="2025-09-30 14:02:38.840762387 +0000 UTC m=+262.823850816" Sep 30 14:02:39 crc kubenswrapper[4676]: I0930 14:02:39.818215 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxnqk" event={"ID":"6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1","Type":"ContainerStarted","Data":"717d8fe1c1ee0b4bf844c19337d43d760923d65831e7318039c761f0872431cf"} Sep 30 14:02:39 crc kubenswrapper[4676]: I0930 14:02:39.852308 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qxnqk" podStartSLOduration=2.394067056 podStartE2EDuration="4.852288055s" podCreationTimestamp="2025-09-30 14:02:35 +0000 UTC" firstStartedPulling="2025-09-30 14:02:36.795388703 +0000 UTC m=+260.778477132" lastFinishedPulling="2025-09-30 14:02:39.253609702 +0000 UTC m=+263.236698131" observedRunningTime="2025-09-30 14:02:39.848857037 +0000 UTC m=+263.831945476" watchObservedRunningTime="2025-09-30 14:02:39.852288055 +0000 UTC m=+263.835376484" Sep 30 14:02:40 crc kubenswrapper[4676]: I0930 14:02:40.826604 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcljw" event={"ID":"6f1bd977-799f-432c-80c7-cec958829e37","Type":"ContainerStarted","Data":"1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012"} Sep 30 14:02:40 crc kubenswrapper[4676]: I0930 14:02:40.849452 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-qcljw" podStartSLOduration=2.972107448 podStartE2EDuration="5.849430939s" podCreationTimestamp="2025-09-30 14:02:35 +0000 UTC" firstStartedPulling="2025-09-30 14:02:36.781822135 +0000 UTC m=+260.764910574" lastFinishedPulling="2025-09-30 14:02:39.659145626 +0000 UTC m=+263.642234065" observedRunningTime="2025-09-30 14:02:40.848827732 +0000 UTC m=+264.831916161" watchObservedRunningTime="2025-09-30 14:02:40.849430939 +0000 UTC m=+264.832519368" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.097815 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pw8ff" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.098224 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pw8ff" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.171192 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pw8ff" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.308864 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l9lpt" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.309292 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l9lpt" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.347178 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l9lpt" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.881148 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l9lpt" Sep 30 14:02:43 crc kubenswrapper[4676]: I0930 14:02:43.885323 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-pw8ff" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.498442 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.499175 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.540140 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.720261 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.720315 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.777235 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.888959 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qcljw" Sep 30 14:02:45 crc kubenswrapper[4676]: I0930 14:02:45.892636 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qxnqk" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.493998 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" podUID="3c5e85dd-ceb0-40eb-86b8-353702e07379" containerName="oauth-openshift" containerID="cri-o://014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428" gracePeriod=15 Sep 30 14:02:57 crc kubenswrapper[4676]: 
I0930 14:02:57.856981 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.883492 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9fc86467f-jj5nh"] Sep 30 14:02:57 crc kubenswrapper[4676]: E0930 14:02:57.883705 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5e85dd-ceb0-40eb-86b8-353702e07379" containerName="oauth-openshift" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.883717 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5e85dd-ceb0-40eb-86b8-353702e07379" containerName="oauth-openshift" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.883800 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5e85dd-ceb0-40eb-86b8-353702e07379" containerName="oauth-openshift" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.884181 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.903494 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9fc86467f-jj5nh"] Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.922508 4676 generic.go:334] "Generic (PLEG): container finished" podID="3c5e85dd-ceb0-40eb-86b8-353702e07379" containerID="014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428" exitCode=0 Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.922563 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" event={"ID":"3c5e85dd-ceb0-40eb-86b8-353702e07379","Type":"ContainerDied","Data":"014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428"} Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.922569 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.922602 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k8mk5" event={"ID":"3c5e85dd-ceb0-40eb-86b8-353702e07379","Type":"ContainerDied","Data":"265d5ec9c55f6c2f5e7f65806973a49fdaf6b19740357cccadb896a1717162f9"} Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.922622 4676 scope.go:117] "RemoveContainer" containerID="014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.941479 4676 scope.go:117] "RemoveContainer" containerID="014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428" Sep 30 14:02:57 crc kubenswrapper[4676]: E0930 14:02:57.942108 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428\": container with ID starting with 014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428 not found: ID does not exist" containerID="014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.942172 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428"} err="failed to get container status \"014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428\": rpc error: code = NotFound desc = could not find container \"014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428\": container with ID starting with 014d9ba740334fd634f6fbc27edf28fa9755a2f261ac738d71b22dd711bf3428 not found: ID does not exist" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960538 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-idp-0-file-data\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960587 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-cliconfig\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960618 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-trusted-ca-bundle\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: 
\"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960640 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-session\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960701 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-serving-cert\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960724 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-service-ca\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960756 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-error\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960782 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-dir\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.960807 4676 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-policies\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961371 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961446 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961458 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961509 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961623 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-router-certs\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961655 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kv7h\" (UniqueName: \"kubernetes.io/projected/3c5e85dd-ceb0-40eb-86b8-353702e07379-kube-api-access-9kv7h\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961684 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-ocp-branding-template\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961705 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-provider-selection\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.961753 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-login\") pod \"3c5e85dd-ceb0-40eb-86b8-353702e07379\" (UID: \"3c5e85dd-ceb0-40eb-86b8-353702e07379\") " Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962427 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962457 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962478 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 
14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962498 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-error\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962521 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66xl\" (UniqueName: \"kubernetes.io/projected/631b6fa2-2f6b-4805-bb74-394b726ae347-kube-api-access-t66xl\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962561 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/631b6fa2-2f6b-4805-bb74-394b726ae347-audit-dir\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962599 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-audit-policies\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962633 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962654 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962672 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-session\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962689 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962707 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: 
\"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962723 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962747 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-login\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962793 4676 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962806 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962815 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.962826 4676 
reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c5e85dd-ceb0-40eb-86b8-353702e07379-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.963709 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.974064 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.974214 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.974907 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.975167 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.975372 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.975394 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5e85dd-ceb0-40eb-86b8-353702e07379-kube-api-access-9kv7h" (OuterVolumeSpecName: "kube-api-access-9kv7h") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "kube-api-access-9kv7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.975609 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.975606 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:57 crc kubenswrapper[4676]: I0930 14:02:57.975755 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3c5e85dd-ceb0-40eb-86b8-353702e07379" (UID: "3c5e85dd-ceb0-40eb-86b8-353702e07379"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063708 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063755 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-session\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063784 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063835 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063864 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-login\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063933 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063957 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063977 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " 
pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.063999 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-error\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064022 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66xl\" (UniqueName: \"kubernetes.io/projected/631b6fa2-2f6b-4805-bb74-394b726ae347-kube-api-access-t66xl\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064047 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/631b6fa2-2f6b-4805-bb74-394b726ae347-audit-dir\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064078 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-audit-policies\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064116 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064163 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064178 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064191 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064204 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064215 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kv7h\" (UniqueName: \"kubernetes.io/projected/3c5e85dd-ceb0-40eb-86b8-353702e07379-kube-api-access-9kv7h\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064228 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064242 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064363 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.064393 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/631b6fa2-2f6b-4805-bb74-394b726ae347-audit-dir\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.065197 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.065244 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-service-ca\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " 
pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.065255 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.065292 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c5e85dd-ceb0-40eb-86b8-353702e07379-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.065331 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-audit-policies\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.065400 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.067554 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc 
kubenswrapper[4676]: I0930 14:02:58.067567 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.068181 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-error\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.068373 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.068767 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.069068 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-session\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.070096 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-user-template-login\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.076432 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/631b6fa2-2f6b-4805-bb74-394b726ae347-v4-0-config-system-router-certs\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.079697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66xl\" (UniqueName: \"kubernetes.io/projected/631b6fa2-2f6b-4805-bb74-394b726ae347-kube-api-access-t66xl\") pod \"oauth-openshift-9fc86467f-jj5nh\" (UID: \"631b6fa2-2f6b-4805-bb74-394b726ae347\") " pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.212254 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.249486 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8mk5"] Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.257591 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k8mk5"] Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.634770 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9fc86467f-jj5nh"] Sep 30 14:02:58 crc kubenswrapper[4676]: I0930 14:02:58.931424 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" event={"ID":"631b6fa2-2f6b-4805-bb74-394b726ae347","Type":"ContainerStarted","Data":"cd80b0947bb9ac74c6afca9d82ffb3add8fd3e4c6db1bcbae3108558428fea82"} Sep 30 14:02:59 crc kubenswrapper[4676]: I0930 14:02:59.447308 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5e85dd-ceb0-40eb-86b8-353702e07379" path="/var/lib/kubelet/pods/3c5e85dd-ceb0-40eb-86b8-353702e07379/volumes" Sep 30 14:02:59 crc kubenswrapper[4676]: I0930 14:02:59.941350 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" event={"ID":"631b6fa2-2f6b-4805-bb74-394b726ae347","Type":"ContainerStarted","Data":"650c19090f4c31f1afe1c34048bb91be8621f81c7b22a3e5eb0b847009668ea3"} Sep 30 14:02:59 crc kubenswrapper[4676]: I0930 14:02:59.941934 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:59 crc kubenswrapper[4676]: I0930 14:02:59.948415 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" Sep 30 14:02:59 crc kubenswrapper[4676]: I0930 
14:02:59.993786 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9fc86467f-jj5nh" podStartSLOduration=27.993748911 podStartE2EDuration="27.993748911s" podCreationTimestamp="2025-09-30 14:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:02:59.964001353 +0000 UTC m=+283.947089802" watchObservedRunningTime="2025-09-30 14:02:59.993748911 +0000 UTC m=+283.976837350" Sep 30 14:04:29 crc kubenswrapper[4676]: I0930 14:04:29.919829 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:04:29 crc kubenswrapper[4676]: I0930 14:04:29.920709 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:04:59 crc kubenswrapper[4676]: I0930 14:04:59.919249 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:04:59 crc kubenswrapper[4676]: I0930 14:04:59.919838 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Sep 30 14:05:29 crc kubenswrapper[4676]: I0930 14:05:29.919668 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:05:29 crc kubenswrapper[4676]: I0930 14:05:29.921142 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:05:29 crc kubenswrapper[4676]: I0930 14:05:29.921221 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:05:29 crc kubenswrapper[4676]: I0930 14:05:29.923666 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1be5dc03e37879b7c4a0ccbe08345138fa5cba0d902530b2119831d575d4b86d"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:05:29 crc kubenswrapper[4676]: I0930 14:05:29.923764 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://1be5dc03e37879b7c4a0ccbe08345138fa5cba0d902530b2119831d575d4b86d" gracePeriod=600 Sep 30 14:05:30 crc kubenswrapper[4676]: I0930 14:05:30.818870 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" 
containerID="1be5dc03e37879b7c4a0ccbe08345138fa5cba0d902530b2119831d575d4b86d" exitCode=0 Sep 30 14:05:30 crc kubenswrapper[4676]: I0930 14:05:30.818959 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"1be5dc03e37879b7c4a0ccbe08345138fa5cba0d902530b2119831d575d4b86d"} Sep 30 14:05:30 crc kubenswrapper[4676]: I0930 14:05:30.819671 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"3afeb7a072f24b3d5a2bdb9ea432b2f6f591ebb112783320e64d4cf45e06e1b0"} Sep 30 14:05:30 crc kubenswrapper[4676]: I0930 14:05:30.819732 4676 scope.go:117] "RemoveContainer" containerID="e6ba88aa74993edfc49871c7cb7c45c498fc9a6d1e2cc573864d4656d4ade55b" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.570732 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh7bp"] Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.572125 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.597122 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh7bp"] Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.709694 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08e8a379-c698-4b9e-9df6-41d9720e7f96-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.709782 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-bound-sa-token\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.709809 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08e8a379-c698-4b9e-9df6-41d9720e7f96-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.709835 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a379-c698-4b9e-9df6-41d9720e7f96-trusted-ca\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.709858 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08e8a379-c698-4b9e-9df6-41d9720e7f96-registry-certificates\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.709900 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4wzq\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-kube-api-access-f4wzq\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.710062 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.710096 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-registry-tls\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.739972 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.813323 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08e8a379-c698-4b9e-9df6-41d9720e7f96-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.813517 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-bound-sa-token\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.813634 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08e8a379-c698-4b9e-9df6-41d9720e7f96-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.813748 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a379-c698-4b9e-9df6-41d9720e7f96-trusted-ca\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc 
kubenswrapper[4676]: I0930 14:05:45.813773 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08e8a379-c698-4b9e-9df6-41d9720e7f96-registry-certificates\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.814217 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08e8a379-c698-4b9e-9df6-41d9720e7f96-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.815294 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08e8a379-c698-4b9e-9df6-41d9720e7f96-registry-certificates\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.815684 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08e8a379-c698-4b9e-9df6-41d9720e7f96-trusted-ca\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.815784 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4wzq\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-kube-api-access-f4wzq\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.816388 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-registry-tls\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.825697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-registry-tls\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.829506 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08e8a379-c698-4b9e-9df6-41d9720e7f96-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.831409 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-bound-sa-token\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: \"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.832653 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4wzq\" (UniqueName: \"kubernetes.io/projected/08e8a379-c698-4b9e-9df6-41d9720e7f96-kube-api-access-f4wzq\") pod \"image-registry-66df7c8f76-vh7bp\" (UID: 
\"08e8a379-c698-4b9e-9df6-41d9720e7f96\") " pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:45 crc kubenswrapper[4676]: I0930 14:05:45.893711 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:46 crc kubenswrapper[4676]: I0930 14:05:46.082247 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vh7bp"] Sep 30 14:05:46 crc kubenswrapper[4676]: I0930 14:05:46.911821 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" event={"ID":"08e8a379-c698-4b9e-9df6-41d9720e7f96","Type":"ContainerStarted","Data":"fbe4a18534bf67822ba2615d5673c8964a9ca1320e1d3fdcefca03123c0b3dd8"} Sep 30 14:05:46 crc kubenswrapper[4676]: I0930 14:05:46.912269 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:05:46 crc kubenswrapper[4676]: I0930 14:05:46.912287 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" event={"ID":"08e8a379-c698-4b9e-9df6-41d9720e7f96","Type":"ContainerStarted","Data":"62e30facb228eea4e450cc763e95de7fdac08aeac7218a7643a1d9da7e745bad"} Sep 30 14:05:46 crc kubenswrapper[4676]: I0930 14:05:46.933501 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" podStartSLOduration=1.933470185 podStartE2EDuration="1.933470185s" podCreationTimestamp="2025-09-30 14:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:05:46.929918542 +0000 UTC m=+450.913006981" watchObservedRunningTime="2025-09-30 14:05:46.933470185 +0000 UTC m=+450.916558614" Sep 30 14:06:05 crc kubenswrapper[4676]: I0930 
14:06:05.901737 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vh7bp" Sep 30 14:06:05 crc kubenswrapper[4676]: I0930 14:06:05.957499 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rgs57"] Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.013054 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" podUID="aa6b14cb-7f79-4bc6-bc14-58325daf3c86" containerName="registry" containerID="cri-o://b67ff64aee72ee79fb252d1c7eeebbd63cc03ce7e2a3d4756d92cb1e85f362ff" gracePeriod=30 Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.149285 4676 generic.go:334] "Generic (PLEG): container finished" podID="aa6b14cb-7f79-4bc6-bc14-58325daf3c86" containerID="b67ff64aee72ee79fb252d1c7eeebbd63cc03ce7e2a3d4756d92cb1e85f362ff" exitCode=0 Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.149397 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" event={"ID":"aa6b14cb-7f79-4bc6-bc14-58325daf3c86","Type":"ContainerDied","Data":"b67ff64aee72ee79fb252d1c7eeebbd63cc03ce7e2a3d4756d92cb1e85f362ff"} Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.357991 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.488913 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-bound-sa-token\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.489044 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-installation-pull-secrets\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.489100 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-trusted-ca\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.489160 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-tls\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.489192 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-certificates\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.489237 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-ca-trust-extracted\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.489574 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.489620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fndjg\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-kube-api-access-fndjg\") pod \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\" (UID: \"aa6b14cb-7f79-4bc6-bc14-58325daf3c86\") " Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.490647 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.491307 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.496529 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.496718 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.502172 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.502693 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-kube-api-access-fndjg" (OuterVolumeSpecName: "kube-api-access-fndjg") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "kube-api-access-fndjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.503070 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.509354 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aa6b14cb-7f79-4bc6-bc14-58325daf3c86" (UID: "aa6b14cb-7f79-4bc6-bc14-58325daf3c86"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.591032 4676 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.591091 4676 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.591109 4676 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.591122 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fndjg\" (UniqueName: 
\"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-kube-api-access-fndjg\") on node \"crc\" DevicePath \"\"" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.591134 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.591145 4676 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 14:06:31 crc kubenswrapper[4676]: I0930 14:06:31.591157 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa6b14cb-7f79-4bc6-bc14-58325daf3c86-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 14:06:32 crc kubenswrapper[4676]: I0930 14:06:32.157757 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" event={"ID":"aa6b14cb-7f79-4bc6-bc14-58325daf3c86","Type":"ContainerDied","Data":"45aaf30474aff199fb5b474ee9cb4ac4253ccc78ab319223fa996ad2992a212c"} Sep 30 14:06:32 crc kubenswrapper[4676]: I0930 14:06:32.158104 4676 scope.go:117] "RemoveContainer" containerID="b67ff64aee72ee79fb252d1c7eeebbd63cc03ce7e2a3d4756d92cb1e85f362ff" Sep 30 14:06:32 crc kubenswrapper[4676]: I0930 14:06:32.158110 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rgs57" Sep 30 14:06:32 crc kubenswrapper[4676]: I0930 14:06:32.184300 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rgs57"] Sep 30 14:06:32 crc kubenswrapper[4676]: I0930 14:06:32.187152 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rgs57"] Sep 30 14:06:33 crc kubenswrapper[4676]: I0930 14:06:33.440893 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6b14cb-7f79-4bc6-bc14-58325daf3c86" path="/var/lib/kubelet/pods/aa6b14cb-7f79-4bc6-bc14-58325daf3c86/volumes" Sep 30 14:07:59 crc kubenswrapper[4676]: I0930 14:07:59.919434 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:07:59 crc kubenswrapper[4676]: I0930 14:07:59.920066 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:08:29 crc kubenswrapper[4676]: I0930 14:08:29.919436 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:08:29 crc kubenswrapper[4676]: I0930 14:08:29.920539 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" 
podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:08:59 crc kubenswrapper[4676]: I0930 14:08:59.920141 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:08:59 crc kubenswrapper[4676]: I0930 14:08:59.921387 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:08:59 crc kubenswrapper[4676]: I0930 14:08:59.921495 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:08:59 crc kubenswrapper[4676]: I0930 14:08:59.922791 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3afeb7a072f24b3d5a2bdb9ea432b2f6f591ebb112783320e64d4cf45e06e1b0"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:08:59 crc kubenswrapper[4676]: I0930 14:08:59.923010 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://3afeb7a072f24b3d5a2bdb9ea432b2f6f591ebb112783320e64d4cf45e06e1b0" gracePeriod=600 Sep 30 
14:09:00 crc kubenswrapper[4676]: I0930 14:09:00.080357 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="3afeb7a072f24b3d5a2bdb9ea432b2f6f591ebb112783320e64d4cf45e06e1b0" exitCode=0 Sep 30 14:09:00 crc kubenswrapper[4676]: I0930 14:09:00.080427 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"3afeb7a072f24b3d5a2bdb9ea432b2f6f591ebb112783320e64d4cf45e06e1b0"} Sep 30 14:09:00 crc kubenswrapper[4676]: I0930 14:09:00.081116 4676 scope.go:117] "RemoveContainer" containerID="1be5dc03e37879b7c4a0ccbe08345138fa5cba0d902530b2119831d575d4b86d" Sep 30 14:09:01 crc kubenswrapper[4676]: I0930 14:09:01.094399 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"8220f97a191b5b5caecda4ca09b3bb938693c25c347212483dfba93ae90b896c"} Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.946474 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qd84c"] Sep 30 14:09:04 crc kubenswrapper[4676]: E0930 14:09:04.948585 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6b14cb-7f79-4bc6-bc14-58325daf3c86" containerName="registry" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.948675 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6b14cb-7f79-4bc6-bc14-58325daf3c86" containerName="registry" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.948897 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6b14cb-7f79-4bc6-bc14-58325daf3c86" containerName="registry" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.949555 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.951551 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7k5cw"] Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.952126 4676 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xgxw5" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.952745 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.953032 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.953078 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7k5cw" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.958378 4676 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xjtrd" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.962843 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qd84c"] Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.968714 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-srpgg"] Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.970776 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.973148 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7k5cw"] Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.973426 4676 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-x8f4r" Sep 30 14:09:04 crc kubenswrapper[4676]: I0930 14:09:04.990139 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-srpgg"] Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.131169 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncp5n\" (UniqueName: \"kubernetes.io/projected/f593d783-2014-4870-b97a-66bd22eba1b4-kube-api-access-ncp5n\") pod \"cert-manager-cainjector-7f985d654d-qd84c\" (UID: \"f593d783-2014-4870-b97a-66bd22eba1b4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.131224 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc5lm\" (UniqueName: \"kubernetes.io/projected/74eb08ee-8ebc-4f31-a952-22b99cfb68ac-kube-api-access-bc5lm\") pod \"cert-manager-5b446d88c5-7k5cw\" (UID: \"74eb08ee-8ebc-4f31-a952-22b99cfb68ac\") " pod="cert-manager/cert-manager-5b446d88c5-7k5cw" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.131324 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ts45\" (UniqueName: \"kubernetes.io/projected/faf2d116-cfb0-451e-b919-5ff1e93ee944-kube-api-access-9ts45\") pod \"cert-manager-webhook-5655c58dd6-srpgg\" (UID: \"faf2d116-cfb0-451e-b919-5ff1e93ee944\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.231975 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncp5n\" (UniqueName: \"kubernetes.io/projected/f593d783-2014-4870-b97a-66bd22eba1b4-kube-api-access-ncp5n\") pod \"cert-manager-cainjector-7f985d654d-qd84c\" (UID: \"f593d783-2014-4870-b97a-66bd22eba1b4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.232251 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5lm\" (UniqueName: \"kubernetes.io/projected/74eb08ee-8ebc-4f31-a952-22b99cfb68ac-kube-api-access-bc5lm\") pod \"cert-manager-5b446d88c5-7k5cw\" (UID: \"74eb08ee-8ebc-4f31-a952-22b99cfb68ac\") " pod="cert-manager/cert-manager-5b446d88c5-7k5cw" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.232368 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ts45\" (UniqueName: \"kubernetes.io/projected/faf2d116-cfb0-451e-b919-5ff1e93ee944-kube-api-access-9ts45\") pod \"cert-manager-webhook-5655c58dd6-srpgg\" (UID: \"faf2d116-cfb0-451e-b919-5ff1e93ee944\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.251120 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5lm\" (UniqueName: \"kubernetes.io/projected/74eb08ee-8ebc-4f31-a952-22b99cfb68ac-kube-api-access-bc5lm\") pod \"cert-manager-5b446d88c5-7k5cw\" (UID: \"74eb08ee-8ebc-4f31-a952-22b99cfb68ac\") " pod="cert-manager/cert-manager-5b446d88c5-7k5cw" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.252277 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncp5n\" (UniqueName: \"kubernetes.io/projected/f593d783-2014-4870-b97a-66bd22eba1b4-kube-api-access-ncp5n\") pod \"cert-manager-cainjector-7f985d654d-qd84c\" (UID: \"f593d783-2014-4870-b97a-66bd22eba1b4\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.254021 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ts45\" (UniqueName: \"kubernetes.io/projected/faf2d116-cfb0-451e-b919-5ff1e93ee944-kube-api-access-9ts45\") pod \"cert-manager-webhook-5655c58dd6-srpgg\" (UID: \"faf2d116-cfb0-451e-b919-5ff1e93ee944\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.278101 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.284627 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7k5cw" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.297421 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.518851 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7k5cw"] Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.548655 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:09:05 crc kubenswrapper[4676]: W0930 14:09:05.772748 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf593d783_2014_4870_b97a_66bd22eba1b4.slice/crio-35689ff3869a023d5c5e1ae5a283d12367f9c07f80016e0ccbf20ced929ccd38 WatchSource:0}: Error finding container 35689ff3869a023d5c5e1ae5a283d12367f9c07f80016e0ccbf20ced929ccd38: Status 404 returned error can't find the container with id 35689ff3869a023d5c5e1ae5a283d12367f9c07f80016e0ccbf20ced929ccd38 Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.775637 4676 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-qd84c"] Sep 30 14:09:05 crc kubenswrapper[4676]: I0930 14:09:05.788350 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-srpgg"] Sep 30 14:09:05 crc kubenswrapper[4676]: W0930 14:09:05.795527 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf2d116_cfb0_451e_b919_5ff1e93ee944.slice/crio-c1f802ec5c4c59ed9bd2a0877690ed21da8bf15f214b8ff20ce20a58c307f8ee WatchSource:0}: Error finding container c1f802ec5c4c59ed9bd2a0877690ed21da8bf15f214b8ff20ce20a58c307f8ee: Status 404 returned error can't find the container with id c1f802ec5c4c59ed9bd2a0877690ed21da8bf15f214b8ff20ce20a58c307f8ee Sep 30 14:09:06 crc kubenswrapper[4676]: I0930 14:09:06.127604 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" event={"ID":"f593d783-2014-4870-b97a-66bd22eba1b4","Type":"ContainerStarted","Data":"35689ff3869a023d5c5e1ae5a283d12367f9c07f80016e0ccbf20ced929ccd38"} Sep 30 14:09:06 crc kubenswrapper[4676]: I0930 14:09:06.129435 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7k5cw" event={"ID":"74eb08ee-8ebc-4f31-a952-22b99cfb68ac","Type":"ContainerStarted","Data":"8d196307b34b8f1f1160f47f2af81b903f3e57ceca9bbbb50556f6983fdb70b9"} Sep 30 14:09:06 crc kubenswrapper[4676]: I0930 14:09:06.130593 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" event={"ID":"faf2d116-cfb0-451e-b919-5ff1e93ee944","Type":"ContainerStarted","Data":"c1f802ec5c4c59ed9bd2a0877690ed21da8bf15f214b8ff20ce20a58c307f8ee"} Sep 30 14:09:12 crc kubenswrapper[4676]: I0930 14:09:12.176928 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" 
event={"ID":"f593d783-2014-4870-b97a-66bd22eba1b4","Type":"ContainerStarted","Data":"639f8b3b209ef82533f0ccf255207bcea9b8fd8d0888e05cd94cdbd9686240a0"} Sep 30 14:09:12 crc kubenswrapper[4676]: I0930 14:09:12.179420 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7k5cw" event={"ID":"74eb08ee-8ebc-4f31-a952-22b99cfb68ac","Type":"ContainerStarted","Data":"94b7ef20f68d608d01c9d65c3586bd24b8292fc06ac8fdd3cf63ca833f443b31"} Sep 30 14:09:12 crc kubenswrapper[4676]: I0930 14:09:12.182174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" event={"ID":"faf2d116-cfb0-451e-b919-5ff1e93ee944","Type":"ContainerStarted","Data":"dc81c58bd89c35f73b6e1e1d3ec0f80a95ed0399bd287a30d24535e80386a051"} Sep 30 14:09:12 crc kubenswrapper[4676]: I0930 14:09:12.182644 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" Sep 30 14:09:12 crc kubenswrapper[4676]: I0930 14:09:12.212824 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-7k5cw" podStartSLOduration=2.788946641 podStartE2EDuration="8.212798251s" podCreationTimestamp="2025-09-30 14:09:04 +0000 UTC" firstStartedPulling="2025-09-30 14:09:05.548408646 +0000 UTC m=+649.531497075" lastFinishedPulling="2025-09-30 14:09:10.972260256 +0000 UTC m=+654.955348685" observedRunningTime="2025-09-30 14:09:12.210092001 +0000 UTC m=+656.193180430" watchObservedRunningTime="2025-09-30 14:09:12.212798251 +0000 UTC m=+656.195886680" Sep 30 14:09:12 crc kubenswrapper[4676]: I0930 14:09:12.213660 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-qd84c" podStartSLOduration=2.926133735 podStartE2EDuration="8.213654793s" podCreationTimestamp="2025-09-30 14:09:04 +0000 UTC" firstStartedPulling="2025-09-30 14:09:05.778052965 +0000 UTC 
m=+649.761141394" lastFinishedPulling="2025-09-30 14:09:11.065574013 +0000 UTC m=+655.048662452" observedRunningTime="2025-09-30 14:09:12.195090212 +0000 UTC m=+656.178178651" watchObservedRunningTime="2025-09-30 14:09:12.213654793 +0000 UTC m=+656.196743222" Sep 30 14:09:12 crc kubenswrapper[4676]: I0930 14:09:12.230349 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" podStartSLOduration=3.054229863 podStartE2EDuration="8.230319375s" podCreationTimestamp="2025-09-30 14:09:04 +0000 UTC" firstStartedPulling="2025-09-30 14:09:05.797373885 +0000 UTC m=+649.780462314" lastFinishedPulling="2025-09-30 14:09:10.973463397 +0000 UTC m=+654.956551826" observedRunningTime="2025-09-30 14:09:12.228654832 +0000 UTC m=+656.211743261" watchObservedRunningTime="2025-09-30 14:09:12.230319375 +0000 UTC m=+656.213407804" Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.443347 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9775s"] Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.444439 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-controller" containerID="cri-o://4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773" gracePeriod=30 Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.444535 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="nbdb" containerID="cri-o://d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef" gracePeriod=30 Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.444587 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" 
podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-acl-logging" containerID="cri-o://6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653" gracePeriod=30 Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.444569 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b" gracePeriod=30 Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.444569 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-node" containerID="cri-o://1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0" gracePeriod=30 Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.444822 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="northd" containerID="cri-o://1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038" gracePeriod=30 Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.444960 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="sbdb" containerID="cri-o://7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e" gracePeriod=30 Sep 30 14:09:15 crc kubenswrapper[4676]: I0930 14:09:15.562569 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller" 
containerID="cri-o://3570aaa8d7310d0e69298c35f7c86dd176f0969ffb22846f86bda60c92e1f439" gracePeriod=30 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.211503 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/2.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.212310 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/1.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.212357 4676 generic.go:334] "Generic (PLEG): container finished" podID="12808c49-1bed-4251-bcbe-fad6207eea57" containerID="113c6305d3745ecf2f275c9e94641b977c27aa7cac2b903fe4da072121bcae96" exitCode=2 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.212422 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerDied","Data":"113c6305d3745ecf2f275c9e94641b977c27aa7cac2b903fe4da072121bcae96"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.212468 4676 scope.go:117] "RemoveContainer" containerID="c09ecf58dd0a201d63bf48e9293f7416eb2fd5f5d394f293ae8f20758bb505b2" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.213427 4676 scope.go:117] "RemoveContainer" containerID="113c6305d3745ecf2f275c9e94641b977c27aa7cac2b903fe4da072121bcae96" Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.213823 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-s7q5x_openshift-multus(12808c49-1bed-4251-bcbe-fad6207eea57)\"" pod="openshift-multus/multus-s7q5x" podUID="12808c49-1bed-4251-bcbe-fad6207eea57" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.214785 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/3.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.217849 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovn-acl-logging/0.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.218562 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovn-controller/0.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.218999 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="3570aaa8d7310d0e69298c35f7c86dd176f0969ffb22846f86bda60c92e1f439" exitCode=0 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219025 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e" exitCode=0 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219035 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef" exitCode=0 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219047 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038" exitCode=0 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219054 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b" exitCode=0 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219065 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0" exitCode=0 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219073 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653" exitCode=143 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219081 4676 generic.go:334] "Generic (PLEG): container finished" podID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerID="4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773" exitCode=143 Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219104 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"3570aaa8d7310d0e69298c35f7c86dd176f0969ffb22846f86bda60c92e1f439"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219128 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219141 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219151 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219161 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219170 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.219195 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773"} Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.247728 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovnkube-controller/3.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.248147 4676 scope.go:117] "RemoveContainer" containerID="b8cab1fe8b98cb8ace2412a246546b498693392bd88d2ce95ed838184c795670" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.252239 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovn-acl-logging/0.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.252760 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovn-controller/0.log" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.253254 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295276 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-systemd\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295327 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-log-socket\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295355 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-ovn-kubernetes\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295379 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-node-log\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295401 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-bin\") pod 
\"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295425 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-openvswitch\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295463 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-env-overrides\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295484 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-netns\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295502 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-netd\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295534 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fae6bdf-2a3f-4961-934d-b8f653412538-ovn-node-metrics-cert\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295556 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-config\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295577 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-ovn\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295595 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-var-lib-openvswitch\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-script-lib\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295639 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-slash\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") " Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295694 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r529k\" (UniqueName: \"kubernetes.io/projected/4fae6bdf-2a3f-4961-934d-b8f653412538-kube-api-access-r529k\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: 
\"4fae6bdf-2a3f-4961-934d-b8f653412538\") "
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295712 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-systemd-units\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") "
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295735 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") "
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295756 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-kubelet\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") "
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.295781 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-etc-openvswitch\") pod \"4fae6bdf-2a3f-4961-934d-b8f653412538\" (UID: \"4fae6bdf-2a3f-4961-934d-b8f653412538\") "
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296117 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296124 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296143 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296413 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296446 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296446 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296482 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296476 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296502 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-log-socket" (OuterVolumeSpecName: "log-socket") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296522 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296544 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296564 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-slash" (OuterVolumeSpecName: "host-slash") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296846 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296906 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.296936 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.297082 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-node-log" (OuterVolumeSpecName: "node-log") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.297157 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.303461 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fae6bdf-2a3f-4961-934d-b8f653412538-kube-api-access-r529k" (OuterVolumeSpecName: "kube-api-access-r529k") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "kube-api-access-r529k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.304550 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fae6bdf-2a3f-4961-934d-b8f653412538-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307426 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6p8bh"]
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307652 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-node"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307667 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-node"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307678 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307683 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307691 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="nbdb"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307696 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="nbdb"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307704 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307711 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307721 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kubecfg-setup"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307727 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kubecfg-setup"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307734 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307739 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307748 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="sbdb"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307754 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="sbdb"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307762 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307769 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307776 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="northd"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307781 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="northd"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307792 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-ovn-metrics"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307798 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-ovn-metrics"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.307806 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-acl-logging"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307811 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-acl-logging"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307912 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="sbdb"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307924 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="northd"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307934 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307946 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-node"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307955 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307963 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-acl-logging"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307971 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="nbdb"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307982 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.307991 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.308001 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.308010 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovn-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.308021 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="kube-rbac-proxy-ovn-metrics"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.308123 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.308131 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: E0930 14:09:16.308143 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.308149 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" containerName="ovnkube-controller"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.309740 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.320585 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4fae6bdf-2a3f-4961-934d-b8f653412538" (UID: "4fae6bdf-2a3f-4961-934d-b8f653412538"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396453 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-ovn\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396499 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-cni-netd\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396518 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkspr\" (UniqueName: \"kubernetes.io/projected/c7fea76e-e952-4574-ba7b-786f42a87daf-kube-api-access-jkspr\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396535 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-etc-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396552 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-run-ovn-kubernetes\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396600 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396627 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396647 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-cni-bin\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396713 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-env-overrides\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396767 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-node-log\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396793 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7fea76e-e952-4574-ba7b-786f42a87daf-ovn-node-metrics-cert\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396814 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-systemd-units\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396833 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-var-lib-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-run-netns\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396875 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-ovnkube-config\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.396966 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-log-socket\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397001 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-ovnkube-script-lib\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397036 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-slash\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397064 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-kubelet\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397089 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-systemd\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397154 4676 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397171 4676 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-systemd\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397197 4676 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-log-socket\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397211 4676 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397223 4676 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-bin\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397234 4676 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-node-log\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397244 4676 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-openvswitch\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397255 4676 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-env-overrides\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397265 4676 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-cni-netd\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397276 4676 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-run-netns\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397285 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4fae6bdf-2a3f-4961-934d-b8f653412538-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397293 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-config\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397301 4676 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-run-ovn\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397309 4676 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397317 4676 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-slash\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397325 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4fae6bdf-2a3f-4961-934d-b8f653412538-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397334 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r529k\" (UniqueName: \"kubernetes.io/projected/4fae6bdf-2a3f-4961-934d-b8f653412538-kube-api-access-r529k\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397342 4676 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-systemd-units\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397350 4676 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.397357 4676 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fae6bdf-2a3f-4961-934d-b8f653412538-host-kubelet\") on node \"crc\" DevicePath \"\""
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.497919 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkspr\" (UniqueName: \"kubernetes.io/projected/c7fea76e-e952-4574-ba7b-786f42a87daf-kube-api-access-jkspr\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.497974 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-etc-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.497999 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-run-ovn-kubernetes\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498022 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498045 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498062 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-run-ovn-kubernetes\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498099 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-cni-bin\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498100 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-etc-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498132 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-env-overrides\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498159 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-cni-bin\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498116 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498193 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-node-log\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498269 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7fea76e-e952-4574-ba7b-786f42a87daf-ovn-node-metrics-cert\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498291 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-systemd-units\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498311 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-var-lib-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498331 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-run-netns\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498237 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-node-log\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498370 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-systemd-units\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498192 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498425 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-run-netns\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498427 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-ovnkube-config\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh"
Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498460 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-var-lib-openvswitch\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498505 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-log-socket\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498544 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-log-socket\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498618 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-ovnkube-script-lib\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498679 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-slash\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498716 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-kubelet\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498750 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-systemd\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498768 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-cni-netd\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498785 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-slash\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498807 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-kubelet\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498787 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-ovn\") pod \"ovnkube-node-6p8bh\" (UID: 
\"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498822 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-systemd\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498822 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-run-ovn\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498863 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c7fea76e-e952-4574-ba7b-786f42a87daf-host-cni-netd\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.498952 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-env-overrides\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.499189 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-ovnkube-config\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 
14:09:16.499557 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c7fea76e-e952-4574-ba7b-786f42a87daf-ovnkube-script-lib\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.501628 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c7fea76e-e952-4574-ba7b-786f42a87daf-ovn-node-metrics-cert\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.514563 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkspr\" (UniqueName: \"kubernetes.io/projected/c7fea76e-e952-4574-ba7b-786f42a87daf-kube-api-access-jkspr\") pod \"ovnkube-node-6p8bh\" (UID: \"c7fea76e-e952-4574-ba7b-786f42a87daf\") " pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:16 crc kubenswrapper[4676]: I0930 14:09:16.623156 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.229376 4676 generic.go:334] "Generic (PLEG): container finished" podID="c7fea76e-e952-4574-ba7b-786f42a87daf" containerID="5f34ee49b248202a8ccb5bf13b1b4a8a8ab97d7bf8ee7180a3400a44f5600b38" exitCode=0 Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.229513 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerDied","Data":"5f34ee49b248202a8ccb5bf13b1b4a8a8ab97d7bf8ee7180a3400a44f5600b38"} Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.229976 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"1154cb052798312ddbf36fdf0b7ca76ebdca4f89bf0a3768cea288edc0eae73f"} Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.240926 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovn-acl-logging/0.log" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.242081 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9775s_4fae6bdf-2a3f-4961-934d-b8f653412538/ovn-controller/0.log" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.242691 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" event={"ID":"4fae6bdf-2a3f-4961-934d-b8f653412538","Type":"ContainerDied","Data":"b4b3ede39112fa145ad47d0259f78cb10b78ed13fbc9ab5e8d51d85991a81773"} Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.242739 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9775s" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.242768 4676 scope.go:117] "RemoveContainer" containerID="3570aaa8d7310d0e69298c35f7c86dd176f0969ffb22846f86bda60c92e1f439" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.246038 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/2.log" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.266961 4676 scope.go:117] "RemoveContainer" containerID="7d05c6ffe2c46461b1604661e27c75ac6f150de39e30e983d4c4dfbe9c84092e" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.303437 4676 scope.go:117] "RemoveContainer" containerID="d2a8ea8d40cefaa9290dfd65c7f1fa80963910c38b71de3992bdcf77fde19eef" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.343148 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9775s"] Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.348037 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9775s"] Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.366152 4676 scope.go:117] "RemoveContainer" containerID="1323c0deb00658d6452b2d6aa4515233551626a4c4def3f4fff539db4a3f5038" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.380048 4676 scope.go:117] "RemoveContainer" containerID="2def96f529a78f915d69adfd74b48057081c10bf8f21bd69fce0d4b305f80a9b" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.403555 4676 scope.go:117] "RemoveContainer" containerID="1dbb7809575c05151462f55a6d1d4eca91ba859d817d152220688e61385beff0" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.423186 4676 scope.go:117] "RemoveContainer" containerID="6836ea3a8ec1a7fdff3ef83bd00963c21caeb4e8264f3b6775201c90b83d4653" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.438123 4676 scope.go:117] "RemoveContainer" 
containerID="4f1d81c0d66a327a4174d0f39a9540f66d89a6c2049490ddad67174aa684b773" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.440812 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fae6bdf-2a3f-4961-934d-b8f653412538" path="/var/lib/kubelet/pods/4fae6bdf-2a3f-4961-934d-b8f653412538/volumes" Sep 30 14:09:17 crc kubenswrapper[4676]: I0930 14:09:17.454811 4676 scope.go:117] "RemoveContainer" containerID="d6b55521ba448057d6f78ab17ef0a1a7d4c8e593e5c7a23bebb4db492fb4b54f" Sep 30 14:09:18 crc kubenswrapper[4676]: I0930 14:09:18.256555 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"2a9020f2cc3bb89bb05b0db92202c2cdfb5c2413dd6720604344d44f87d7a0e8"} Sep 30 14:09:18 crc kubenswrapper[4676]: I0930 14:09:18.256988 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"37aa40309e3e907f52dd4f9a7cc434dbbc9dad2e3b7ef63d6fc9fd900b123d84"} Sep 30 14:09:18 crc kubenswrapper[4676]: I0930 14:09:18.257000 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"b82dd255b2f85283fda3a0a75b3235959d71c084384a4c65ae8293a94ab19623"} Sep 30 14:09:19 crc kubenswrapper[4676]: I0930 14:09:19.267739 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"6173b1edcde7790224fe50c16947712ffb1a78b69ef7a53ac8a245317bfc6777"} Sep 30 14:09:19 crc kubenswrapper[4676]: I0930 14:09:19.268262 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" 
event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"329b1e3cce8f379bea90a1f6086fed88b1365eeb1ce4f85c7fd84e2f9bde2ac9"} Sep 30 14:09:19 crc kubenswrapper[4676]: I0930 14:09:19.268281 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"ea7d01f08d42fad5cf1d97245ed7743145914184daa65ba0a1fe5b7519b1d9eb"} Sep 30 14:09:20 crc kubenswrapper[4676]: I0930 14:09:20.300844 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-srpgg" Sep 30 14:09:22 crc kubenswrapper[4676]: I0930 14:09:22.296267 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"c6d339f0012b340bc4c4385e9c21af833109e7eed3340361fe39deed2d4eb4ba"} Sep 30 14:09:24 crc kubenswrapper[4676]: I0930 14:09:24.315870 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" event={"ID":"c7fea76e-e952-4574-ba7b-786f42a87daf","Type":"ContainerStarted","Data":"43fe76b3eb07a679116271a0f34affcacae1b9ca49835ec8cb4c82d84eda132c"} Sep 30 14:09:24 crc kubenswrapper[4676]: I0930 14:09:24.316511 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:24 crc kubenswrapper[4676]: I0930 14:09:24.316538 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:24 crc kubenswrapper[4676]: I0930 14:09:24.345455 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:24 crc kubenswrapper[4676]: I0930 14:09:24.356128 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" podStartSLOduration=8.356091686 podStartE2EDuration="8.356091686s" podCreationTimestamp="2025-09-30 14:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:09:24.347927464 +0000 UTC m=+668.331015913" watchObservedRunningTime="2025-09-30 14:09:24.356091686 +0000 UTC m=+668.339180125" Sep 30 14:09:25 crc kubenswrapper[4676]: I0930 14:09:25.321094 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:25 crc kubenswrapper[4676]: I0930 14:09:25.364353 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:09:27 crc kubenswrapper[4676]: I0930 14:09:27.436036 4676 scope.go:117] "RemoveContainer" containerID="113c6305d3745ecf2f275c9e94641b977c27aa7cac2b903fe4da072121bcae96" Sep 30 14:09:27 crc kubenswrapper[4676]: E0930 14:09:27.436931 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-s7q5x_openshift-multus(12808c49-1bed-4251-bcbe-fad6207eea57)\"" pod="openshift-multus/multus-s7q5x" podUID="12808c49-1bed-4251-bcbe-fad6207eea57" Sep 30 14:09:42 crc kubenswrapper[4676]: I0930 14:09:42.433391 4676 scope.go:117] "RemoveContainer" containerID="113c6305d3745ecf2f275c9e94641b977c27aa7cac2b903fe4da072121bcae96" Sep 30 14:09:43 crc kubenswrapper[4676]: I0930 14:09:43.441524 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s7q5x_12808c49-1bed-4251-bcbe-fad6207eea57/kube-multus/2.log" Sep 30 14:09:43 crc kubenswrapper[4676]: I0930 14:09:43.442368 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s7q5x" 
event={"ID":"12808c49-1bed-4251-bcbe-fad6207eea57","Type":"ContainerStarted","Data":"867e001c8b58bef2da4bc15620f7ddb762fae1188db66f0ed7e190638c7a709d"} Sep 30 14:09:46 crc kubenswrapper[4676]: I0930 14:09:46.646031 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6p8bh" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.646188 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x"] Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.648403 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.652671 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.652952 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x"] Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.677965 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jncb5\" (UniqueName: \"kubernetes.io/projected/bc584293-077a-42bb-a90c-5f1931728034-kube-api-access-jncb5\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.678071 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-bundle\") pod 
\"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.678148 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.779765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.779846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.779872 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jncb5\" (UniqueName: \"kubernetes.io/projected/bc584293-077a-42bb-a90c-5f1931728034-kube-api-access-jncb5\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.780474 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.780525 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.801076 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jncb5\" (UniqueName: \"kubernetes.io/projected/bc584293-077a-42bb-a90c-5f1931728034-kube-api-access-jncb5\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:03 crc kubenswrapper[4676]: I0930 14:10:03.973699 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:04 crc kubenswrapper[4676]: I0930 14:10:04.157042 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x"] Sep 30 14:10:04 crc kubenswrapper[4676]: I0930 14:10:04.576916 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" event={"ID":"bc584293-077a-42bb-a90c-5f1931728034","Type":"ContainerStarted","Data":"8c426a6fbf367287ea9e1faad9be54d3be484db091f2269d181838c5fd858ed7"} Sep 30 14:10:04 crc kubenswrapper[4676]: I0930 14:10:04.577379 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" event={"ID":"bc584293-077a-42bb-a90c-5f1931728034","Type":"ContainerStarted","Data":"ba73f903f975b14f2b946a3c3017fa80b748745ce6d4d8c7964d352dba0d0534"} Sep 30 14:10:05 crc kubenswrapper[4676]: I0930 14:10:05.584004 4676 generic.go:334] "Generic (PLEG): container finished" podID="bc584293-077a-42bb-a90c-5f1931728034" containerID="8c426a6fbf367287ea9e1faad9be54d3be484db091f2269d181838c5fd858ed7" exitCode=0 Sep 30 14:10:05 crc kubenswrapper[4676]: I0930 14:10:05.584068 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" event={"ID":"bc584293-077a-42bb-a90c-5f1931728034","Type":"ContainerDied","Data":"8c426a6fbf367287ea9e1faad9be54d3be484db091f2269d181838c5fd858ed7"} Sep 30 14:10:07 crc kubenswrapper[4676]: I0930 14:10:07.596912 4676 generic.go:334] "Generic (PLEG): container finished" podID="bc584293-077a-42bb-a90c-5f1931728034" containerID="7fb1a6ae5ccb0192ceecce3dd241a66259e6ebd8b3341f8d8cc66c0004502f6c" exitCode=0 Sep 30 14:10:07 crc kubenswrapper[4676]: I0930 14:10:07.597065 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" event={"ID":"bc584293-077a-42bb-a90c-5f1931728034","Type":"ContainerDied","Data":"7fb1a6ae5ccb0192ceecce3dd241a66259e6ebd8b3341f8d8cc66c0004502f6c"} Sep 30 14:10:08 crc kubenswrapper[4676]: I0930 14:10:08.605235 4676 generic.go:334] "Generic (PLEG): container finished" podID="bc584293-077a-42bb-a90c-5f1931728034" containerID="30b6c388d4b4f0922aa05ac441b5a771148066277c8c634398280dce1251296f" exitCode=0 Sep 30 14:10:08 crc kubenswrapper[4676]: I0930 14:10:08.605291 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" event={"ID":"bc584293-077a-42bb-a90c-5f1931728034","Type":"ContainerDied","Data":"30b6c388d4b4f0922aa05ac441b5a771148066277c8c634398280dce1251296f"} Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.833991 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.865872 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jncb5\" (UniqueName: \"kubernetes.io/projected/bc584293-077a-42bb-a90c-5f1931728034-kube-api-access-jncb5\") pod \"bc584293-077a-42bb-a90c-5f1931728034\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.866016 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-util\") pod \"bc584293-077a-42bb-a90c-5f1931728034\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.866047 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-bundle\") pod \"bc584293-077a-42bb-a90c-5f1931728034\" (UID: \"bc584293-077a-42bb-a90c-5f1931728034\") " Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.867221 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-bundle" (OuterVolumeSpecName: "bundle") pod "bc584293-077a-42bb-a90c-5f1931728034" (UID: "bc584293-077a-42bb-a90c-5f1931728034"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.872089 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc584293-077a-42bb-a90c-5f1931728034-kube-api-access-jncb5" (OuterVolumeSpecName: "kube-api-access-jncb5") pod "bc584293-077a-42bb-a90c-5f1931728034" (UID: "bc584293-077a-42bb-a90c-5f1931728034"). InnerVolumeSpecName "kube-api-access-jncb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.877081 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-util" (OuterVolumeSpecName: "util") pod "bc584293-077a-42bb-a90c-5f1931728034" (UID: "bc584293-077a-42bb-a90c-5f1931728034"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.967774 4676 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-util\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.967813 4676 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc584293-077a-42bb-a90c-5f1931728034-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:09 crc kubenswrapper[4676]: I0930 14:10:09.967824 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jncb5\" (UniqueName: \"kubernetes.io/projected/bc584293-077a-42bb-a90c-5f1931728034-kube-api-access-jncb5\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:10 crc kubenswrapper[4676]: I0930 14:10:10.625456 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" event={"ID":"bc584293-077a-42bb-a90c-5f1931728034","Type":"ContainerDied","Data":"ba73f903f975b14f2b946a3c3017fa80b748745ce6d4d8c7964d352dba0d0534"} Sep 30 14:10:10 crc kubenswrapper[4676]: I0930 14:10:10.625515 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba73f903f975b14f2b946a3c3017fa80b748745ce6d4d8c7964d352dba0d0534" Sep 30 14:10:10 crc kubenswrapper[4676]: I0930 14:10:10.625924 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.179292 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459"] Sep 30 14:10:15 crc kubenswrapper[4676]: E0930 14:10:15.181029 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc584293-077a-42bb-a90c-5f1931728034" containerName="extract" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.181132 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc584293-077a-42bb-a90c-5f1931728034" containerName="extract" Sep 30 14:10:15 crc kubenswrapper[4676]: E0930 14:10:15.181214 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc584293-077a-42bb-a90c-5f1931728034" containerName="pull" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.181277 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc584293-077a-42bb-a90c-5f1931728034" containerName="pull" Sep 30 14:10:15 crc kubenswrapper[4676]: E0930 14:10:15.181377 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc584293-077a-42bb-a90c-5f1931728034" containerName="util" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.181450 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc584293-077a-42bb-a90c-5f1931728034" containerName="util" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.181679 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc584293-077a-42bb-a90c-5f1931728034" containerName="extract" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.182388 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.185502 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.185609 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-l88qw" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.186380 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.200011 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459"] Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.247528 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckt4s\" (UniqueName: \"kubernetes.io/projected/73fea0d7-e7c7-4db0-8205-1c86203f6a88-kube-api-access-ckt4s\") pod \"nmstate-operator-5d6f6cfd66-4v459\" (UID: \"73fea0d7-e7c7-4db0-8205-1c86203f6a88\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.348915 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckt4s\" (UniqueName: \"kubernetes.io/projected/73fea0d7-e7c7-4db0-8205-1c86203f6a88-kube-api-access-ckt4s\") pod \"nmstate-operator-5d6f6cfd66-4v459\" (UID: \"73fea0d7-e7c7-4db0-8205-1c86203f6a88\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.367017 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckt4s\" (UniqueName: \"kubernetes.io/projected/73fea0d7-e7c7-4db0-8205-1c86203f6a88-kube-api-access-ckt4s\") pod \"nmstate-operator-5d6f6cfd66-4v459\" (UID: 
\"73fea0d7-e7c7-4db0-8205-1c86203f6a88\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" Sep 30 14:10:15 crc kubenswrapper[4676]: I0930 14:10:15.509865 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" Sep 30 14:10:16 crc kubenswrapper[4676]: I0930 14:10:16.047368 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459"] Sep 30 14:10:16 crc kubenswrapper[4676]: I0930 14:10:16.672986 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" event={"ID":"73fea0d7-e7c7-4db0-8205-1c86203f6a88","Type":"ContainerStarted","Data":"a3db12cd66e030566648969bf6a4678dfa08c716a6c8307e5111b023e9a40d52"} Sep 30 14:10:19 crc kubenswrapper[4676]: I0930 14:10:19.693340 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" event={"ID":"73fea0d7-e7c7-4db0-8205-1c86203f6a88","Type":"ContainerStarted","Data":"c7895e81b29809ad24b409f066d5ab7c07546505a6e1ff25557808d4e001ce2f"} Sep 30 14:10:19 crc kubenswrapper[4676]: I0930 14:10:19.713117 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-4v459" podStartSLOduration=1.556866373 podStartE2EDuration="4.713090257s" podCreationTimestamp="2025-09-30 14:10:15 +0000 UTC" firstStartedPulling="2025-09-30 14:10:16.064493227 +0000 UTC m=+720.047581656" lastFinishedPulling="2025-09-30 14:10:19.220717111 +0000 UTC m=+723.203805540" observedRunningTime="2025-09-30 14:10:19.710454778 +0000 UTC m=+723.693543207" watchObservedRunningTime="2025-09-30 14:10:19.713090257 +0000 UTC m=+723.696178686" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.086155 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 
14:10:24.087763 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.089799 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-gnr2t" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.090455 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.091446 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.094960 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.117649 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.122657 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.133450 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-w7g4n"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.134340 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.171804 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25qld\" (UniqueName: \"kubernetes.io/projected/9820f583-f5b4-4642-ade6-683242648b4d-kube-api-access-25qld\") pod \"nmstate-webhook-6d689559c5-rb9kz\" (UID: \"9820f583-f5b4-4642-ade6-683242648b4d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.171848 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-dbus-socket\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.171875 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxdwk\" (UniqueName: \"kubernetes.io/projected/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-kube-api-access-wxdwk\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.171948 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-nmstate-lock\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.171967 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-ovs-socket\") pod 
\"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.172026 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9820f583-f5b4-4642-ade6-683242648b4d-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rb9kz\" (UID: \"9820f583-f5b4-4642-ade6-683242648b4d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.172059 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmgn\" (UniqueName: \"kubernetes.io/projected/6c1f56f8-8d1c-47f2-9099-25a15fdaee77-kube-api-access-kdmgn\") pod \"nmstate-metrics-58fcddf996-w5b6p\" (UID: \"6c1f56f8-8d1c-47f2-9099-25a15fdaee77\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.254380 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.255328 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.258299 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.258318 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-44h98" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.258678 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.267634 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273079 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdmgn\" (UniqueName: \"kubernetes.io/projected/6c1f56f8-8d1c-47f2-9099-25a15fdaee77-kube-api-access-kdmgn\") pod \"nmstate-metrics-58fcddf996-w5b6p\" (UID: \"6c1f56f8-8d1c-47f2-9099-25a15fdaee77\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273157 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273189 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: 
\"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273214 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25qld\" (UniqueName: \"kubernetes.io/projected/9820f583-f5b4-4642-ade6-683242648b4d-kube-api-access-25qld\") pod \"nmstate-webhook-6d689559c5-rb9kz\" (UID: \"9820f583-f5b4-4642-ade6-683242648b4d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273246 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-dbus-socket\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273281 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkdzf\" (UniqueName: \"kubernetes.io/projected/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-kube-api-access-kkdzf\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273305 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxdwk\" (UniqueName: \"kubernetes.io/projected/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-kube-api-access-wxdwk\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273331 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-nmstate-lock\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273355 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-ovs-socket\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.273407 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9820f583-f5b4-4642-ade6-683242648b4d-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rb9kz\" (UID: \"9820f583-f5b4-4642-ade6-683242648b4d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: E0930 14:10:24.273555 4676 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 14:10:24 crc kubenswrapper[4676]: E0930 14:10:24.273626 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9820f583-f5b4-4642-ade6-683242648b4d-tls-key-pair podName:9820f583-f5b4-4642-ade6-683242648b4d nodeName:}" failed. No retries permitted until 2025-09-30 14:10:24.773598668 +0000 UTC m=+728.756687097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/9820f583-f5b4-4642-ade6-683242648b4d-tls-key-pair") pod "nmstate-webhook-6d689559c5-rb9kz" (UID: "9820f583-f5b4-4642-ade6-683242648b4d") : secret "openshift-nmstate-webhook" not found Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.274109 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-ovs-socket\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.274127 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-dbus-socket\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.274331 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-nmstate-lock\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.297038 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxdwk\" (UniqueName: \"kubernetes.io/projected/ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc-kube-api-access-wxdwk\") pod \"nmstate-handler-w7g4n\" (UID: \"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc\") " pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.297373 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25qld\" (UniqueName: 
\"kubernetes.io/projected/9820f583-f5b4-4642-ade6-683242648b4d-kube-api-access-25qld\") pod \"nmstate-webhook-6d689559c5-rb9kz\" (UID: \"9820f583-f5b4-4642-ade6-683242648b4d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.298177 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdmgn\" (UniqueName: \"kubernetes.io/projected/6c1f56f8-8d1c-47f2-9099-25a15fdaee77-kube-api-access-kdmgn\") pod \"nmstate-metrics-58fcddf996-w5b6p\" (UID: \"6c1f56f8-8d1c-47f2-9099-25a15fdaee77\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.374765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.375312 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.375349 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkdzf\" (UniqueName: \"kubernetes.io/projected/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-kube-api-access-kkdzf\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: E0930 14:10:24.375007 4676 secret.go:188] 
Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 30 14:10:24 crc kubenswrapper[4676]: E0930 14:10:24.375487 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-plugin-serving-cert podName:87e24d2c-e308-4f03-a28c-eb3ca52bb5f6 nodeName:}" failed. No retries permitted until 2025-09-30 14:10:24.875459807 +0000 UTC m=+728.858548236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-l7jdz" (UID: "87e24d2c-e308-4f03-a28c-eb3ca52bb5f6") : secret "plugin-serving-cert" not found Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.376605 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.395102 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkdzf\" (UniqueName: \"kubernetes.io/projected/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-kube-api-access-kkdzf\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.412980 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.443919 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bcb5d4c85-hj6xm"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.444916 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.455868 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.457846 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcb5d4c85-hj6xm"] Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.476728 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-trusted-ca-bundle\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.476783 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-oauth-config\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.476807 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ms62\" (UniqueName: \"kubernetes.io/projected/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-kube-api-access-4ms62\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " 
pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.476827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-service-ca\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.476909 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-oauth-serving-cert\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.476956 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-config\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.476982 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-serving-cert\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.578573 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-oauth-config\") pod \"console-7bcb5d4c85-hj6xm\" (UID: 
\"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.579077 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ms62\" (UniqueName: \"kubernetes.io/projected/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-kube-api-access-4ms62\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.579122 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-service-ca\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.579194 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-oauth-serving-cert\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.579256 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-config\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.579288 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-serving-cert\") pod \"console-7bcb5d4c85-hj6xm\" (UID: 
\"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.579332 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-trusted-ca-bundle\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.580231 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-service-ca\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.580828 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-oauth-serving-cert\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.581461 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-trusted-ca-bundle\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.581624 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-config\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " 
pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.587705 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-serving-cert\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.603777 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ms62\" (UniqueName: \"kubernetes.io/projected/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-kube-api-access-4ms62\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.608148 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9-console-oauth-config\") pod \"console-7bcb5d4c85-hj6xm\" (UID: \"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9\") " pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.683746 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p"] Sep 30 14:10:24 crc kubenswrapper[4676]: W0930 14:10:24.692233 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c1f56f8_8d1c_47f2_9099_25a15fdaee77.slice/crio-fb8957cf35e451ef70e5123c09cfaa13876d2c94054f6b5f286463cf90771f64 WatchSource:0}: Error finding container fb8957cf35e451ef70e5123c09cfaa13876d2c94054f6b5f286463cf90771f64: Status 404 returned error can't find the container with id fb8957cf35e451ef70e5123c09cfaa13876d2c94054f6b5f286463cf90771f64 Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 
14:10:24.728707 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-w7g4n" event={"ID":"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc","Type":"ContainerStarted","Data":"5bbac7c1aca680ff4ca1565520aac7e9c8b07d664eb5954be518ab3c0753d849"} Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.730566 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" event={"ID":"6c1f56f8-8d1c-47f2-9099-25a15fdaee77","Type":"ContainerStarted","Data":"fb8957cf35e451ef70e5123c09cfaa13876d2c94054f6b5f286463cf90771f64"} Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.782356 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9820f583-f5b4-4642-ade6-683242648b4d-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rb9kz\" (UID: \"9820f583-f5b4-4642-ade6-683242648b4d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.786079 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9820f583-f5b4-4642-ade6-683242648b4d-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rb9kz\" (UID: \"9820f583-f5b4-4642-ade6-683242648b4d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.794119 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.886953 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.892743 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87e24d2c-e308-4f03-a28c-eb3ca52bb5f6-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-l7jdz\" (UID: \"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:24 crc kubenswrapper[4676]: I0930 14:10:24.992745 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcb5d4c85-hj6xm"] Sep 30 14:10:24 crc kubenswrapper[4676]: W0930 14:10:24.997535 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc4b44cb_3ce4_4ede_84f6_1eb482ca17e9.slice/crio-9c42ee6266aabf960acfdebeea6a630f1509ec1415949098e6bd5ad2930ebe3b WatchSource:0}: Error finding container 9c42ee6266aabf960acfdebeea6a630f1509ec1415949098e6bd5ad2930ebe3b: Status 404 returned error can't find the container with id 9c42ee6266aabf960acfdebeea6a630f1509ec1415949098e6bd5ad2930ebe3b Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.028056 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.174738 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.447280 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz"] Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.447732 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz"] Sep 30 14:10:25 crc kubenswrapper[4676]: W0930 14:10:25.465469 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e24d2c_e308_4f03_a28c_eb3ca52bb5f6.slice/crio-3c7cc5e440c0cf71e9dc6abd5a85ffb8363d3dfe4ba766a545c12023e4f9d173 WatchSource:0}: Error finding container 3c7cc5e440c0cf71e9dc6abd5a85ffb8363d3dfe4ba766a545c12023e4f9d173: Status 404 returned error can't find the container with id 3c7cc5e440c0cf71e9dc6abd5a85ffb8363d3dfe4ba766a545c12023e4f9d173 Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.740116 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcb5d4c85-hj6xm" event={"ID":"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9","Type":"ContainerStarted","Data":"ac4a7dd7836b574e92a61b5cb5eef675458b978189af2ddeb5818cebf2ec657e"} Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.740208 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcb5d4c85-hj6xm" event={"ID":"cc4b44cb-3ce4-4ede-84f6-1eb482ca17e9","Type":"ContainerStarted","Data":"9c42ee6266aabf960acfdebeea6a630f1509ec1415949098e6bd5ad2930ebe3b"} Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.742185 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" event={"ID":"9820f583-f5b4-4642-ade6-683242648b4d","Type":"ContainerStarted","Data":"9c751b3e3a732ad59f6e82fb03b0dcd88a6e1e3608d138e5aab8134c1d0b8400"} Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 
14:10:25.743873 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" event={"ID":"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6","Type":"ContainerStarted","Data":"3c7cc5e440c0cf71e9dc6abd5a85ffb8363d3dfe4ba766a545c12023e4f9d173"} Sep 30 14:10:25 crc kubenswrapper[4676]: I0930 14:10:25.774770 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bcb5d4c85-hj6xm" podStartSLOduration=1.774740604 podStartE2EDuration="1.774740604s" podCreationTimestamp="2025-09-30 14:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:10:25.772477005 +0000 UTC m=+729.755565464" watchObservedRunningTime="2025-09-30 14:10:25.774740604 +0000 UTC m=+729.757829043" Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.772331 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" event={"ID":"9820f583-f5b4-4642-ade6-683242648b4d","Type":"ContainerStarted","Data":"1f14545f9ff254c1a860b33371036fc75d070e82e3b81a5d5fd278718e47d154"} Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.773036 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.773568 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-w7g4n" event={"ID":"ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc","Type":"ContainerStarted","Data":"f29512e0df6a4eaab980bee17e74112c9250444b315f183bb4c8a027b6913122"} Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.773699 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.775192 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" event={"ID":"6c1f56f8-8d1c-47f2-9099-25a15fdaee77","Type":"ContainerStarted","Data":"3a6d06a7adaa02eea4ca33d04042bfb9e10126b7df004de2be3cedf7fd2f047c"} Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.776506 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" event={"ID":"87e24d2c-e308-4f03-a28c-eb3ca52bb5f6","Type":"ContainerStarted","Data":"09e2d509daf6b387484fa4b696a4322522bf298ee98f597fe8355a4aa76464cf"} Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.793543 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" podStartSLOduration=2.223999625 podStartE2EDuration="5.79352267s" podCreationTimestamp="2025-09-30 14:10:24 +0000 UTC" firstStartedPulling="2025-09-30 14:10:25.457603903 +0000 UTC m=+729.440692332" lastFinishedPulling="2025-09-30 14:10:29.027126948 +0000 UTC m=+733.010215377" observedRunningTime="2025-09-30 14:10:29.792142223 +0000 UTC m=+733.775230652" watchObservedRunningTime="2025-09-30 14:10:29.79352267 +0000 UTC m=+733.776611099" Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.816350 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-w7g4n" podStartSLOduration=1.31498861 podStartE2EDuration="5.816326935s" podCreationTimestamp="2025-09-30 14:10:24 +0000 UTC" firstStartedPulling="2025-09-30 14:10:24.504125007 +0000 UTC m=+728.487213436" lastFinishedPulling="2025-09-30 14:10:29.005463332 +0000 UTC m=+732.988551761" observedRunningTime="2025-09-30 14:10:29.812719171 +0000 UTC m=+733.795807610" watchObservedRunningTime="2025-09-30 14:10:29.816326935 +0000 UTC m=+733.799415364" Sep 30 14:10:29 crc kubenswrapper[4676]: I0930 14:10:29.836155 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-l7jdz" 
podStartSLOduration=2.308169562 podStartE2EDuration="5.836131272s" podCreationTimestamp="2025-09-30 14:10:24 +0000 UTC" firstStartedPulling="2025-09-30 14:10:25.468773525 +0000 UTC m=+729.451861964" lastFinishedPulling="2025-09-30 14:10:28.996735245 +0000 UTC m=+732.979823674" observedRunningTime="2025-09-30 14:10:29.833914274 +0000 UTC m=+733.817002703" watchObservedRunningTime="2025-09-30 14:10:29.836131272 +0000 UTC m=+733.819219701" Sep 30 14:10:31 crc kubenswrapper[4676]: I0930 14:10:31.789664 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" event={"ID":"6c1f56f8-8d1c-47f2-9099-25a15fdaee77","Type":"ContainerStarted","Data":"3d2fd0abdd707dd43999fec8bbe8135e77d3a5dae1df8c8dfe5b0b4351fd062b"} Sep 30 14:10:31 crc kubenswrapper[4676]: I0930 14:10:31.814630 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-w5b6p" podStartSLOduration=1.082386036 podStartE2EDuration="7.814599832s" podCreationTimestamp="2025-09-30 14:10:24 +0000 UTC" firstStartedPulling="2025-09-30 14:10:24.695292769 +0000 UTC m=+728.678381198" lastFinishedPulling="2025-09-30 14:10:31.427506565 +0000 UTC m=+735.410594994" observedRunningTime="2025-09-30 14:10:31.81220322 +0000 UTC m=+735.795291649" watchObservedRunningTime="2025-09-30 14:10:31.814599832 +0000 UTC m=+735.797688261" Sep 30 14:10:34 crc kubenswrapper[4676]: I0930 14:10:34.484017 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-w7g4n" Sep 30 14:10:34 crc kubenswrapper[4676]: I0930 14:10:34.794752 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:34 crc kubenswrapper[4676]: I0930 14:10:34.795190 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:34 crc kubenswrapper[4676]: I0930 
14:10:34.799306 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:34 crc kubenswrapper[4676]: I0930 14:10:34.814084 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bcb5d4c85-hj6xm" Sep 30 14:10:34 crc kubenswrapper[4676]: I0930 14:10:34.875422 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zgcbx"] Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.033598 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rb9kz" Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.512060 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qbdpc"] Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.512432 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" podUID="61b1921b-4102-4959-abd7-c86ca3ae880e" containerName="controller-manager" containerID="cri-o://b97143f2266a0da32711e5f0615f458fab187e7156146ccd6101995077239953" gracePeriod=30 Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.604317 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"] Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.605424 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" podUID="0b3b285f-9406-4f9b-9768-8827933418d7" containerName="route-controller-manager" containerID="cri-o://bb9f3c494c7aa1a871ede7d9a4d19bc790048d46eefbc6f4f7a6fc1e15ce6ddf" gracePeriod=30 Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.881672 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="61b1921b-4102-4959-abd7-c86ca3ae880e" containerID="b97143f2266a0da32711e5f0615f458fab187e7156146ccd6101995077239953" exitCode=0 Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.881848 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" event={"ID":"61b1921b-4102-4959-abd7-c86ca3ae880e","Type":"ContainerDied","Data":"b97143f2266a0da32711e5f0615f458fab187e7156146ccd6101995077239953"} Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.884237 4676 generic.go:334] "Generic (PLEG): container finished" podID="0b3b285f-9406-4f9b-9768-8827933418d7" containerID="bb9f3c494c7aa1a871ede7d9a4d19bc790048d46eefbc6f4f7a6fc1e15ce6ddf" exitCode=0 Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.884348 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" event={"ID":"0b3b285f-9406-4f9b-9768-8827933418d7","Type":"ContainerDied","Data":"bb9f3c494c7aa1a871ede7d9a4d19bc790048d46eefbc6f4f7a6fc1e15ce6ddf"} Sep 30 14:10:45 crc kubenswrapper[4676]: I0930 14:10:45.999318 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.007076 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019253 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-client-ca\") pod \"61b1921b-4102-4959-abd7-c86ca3ae880e\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019381 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzlh4\" (UniqueName: \"kubernetes.io/projected/0b3b285f-9406-4f9b-9768-8827933418d7-kube-api-access-vzlh4\") pod \"0b3b285f-9406-4f9b-9768-8827933418d7\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019422 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-proxy-ca-bundles\") pod \"61b1921b-4102-4959-abd7-c86ca3ae880e\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019467 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-client-ca\") pod \"0b3b285f-9406-4f9b-9768-8827933418d7\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019493 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-config\") pod \"61b1921b-4102-4959-abd7-c86ca3ae880e\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019538 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-config\") pod \"0b3b285f-9406-4f9b-9768-8827933418d7\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019592 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwd5w\" (UniqueName: \"kubernetes.io/projected/61b1921b-4102-4959-abd7-c86ca3ae880e-kube-api-access-rwd5w\") pod \"61b1921b-4102-4959-abd7-c86ca3ae880e\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019615 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3b285f-9406-4f9b-9768-8827933418d7-serving-cert\") pod \"0b3b285f-9406-4f9b-9768-8827933418d7\" (UID: \"0b3b285f-9406-4f9b-9768-8827933418d7\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.019636 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b1921b-4102-4959-abd7-c86ca3ae880e-serving-cert\") pod \"61b1921b-4102-4959-abd7-c86ca3ae880e\" (UID: \"61b1921b-4102-4959-abd7-c86ca3ae880e\") " Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.024358 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-client-ca" (OuterVolumeSpecName: "client-ca") pod "61b1921b-4102-4959-abd7-c86ca3ae880e" (UID: "61b1921b-4102-4959-abd7-c86ca3ae880e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.024334 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b3b285f-9406-4f9b-9768-8827933418d7" (UID: "0b3b285f-9406-4f9b-9768-8827933418d7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.024624 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "61b1921b-4102-4959-abd7-c86ca3ae880e" (UID: "61b1921b-4102-4959-abd7-c86ca3ae880e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.025232 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-config" (OuterVolumeSpecName: "config") pod "61b1921b-4102-4959-abd7-c86ca3ae880e" (UID: "61b1921b-4102-4959-abd7-c86ca3ae880e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.025568 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-config" (OuterVolumeSpecName: "config") pod "0b3b285f-9406-4f9b-9768-8827933418d7" (UID: "0b3b285f-9406-4f9b-9768-8827933418d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.032970 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b3b285f-9406-4f9b-9768-8827933418d7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b3b285f-9406-4f9b-9768-8827933418d7" (UID: "0b3b285f-9406-4f9b-9768-8827933418d7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.033285 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b1921b-4102-4959-abd7-c86ca3ae880e-kube-api-access-rwd5w" (OuterVolumeSpecName: "kube-api-access-rwd5w") pod "61b1921b-4102-4959-abd7-c86ca3ae880e" (UID: "61b1921b-4102-4959-abd7-c86ca3ae880e"). InnerVolumeSpecName "kube-api-access-rwd5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.044176 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3b285f-9406-4f9b-9768-8827933418d7-kube-api-access-vzlh4" (OuterVolumeSpecName: "kube-api-access-vzlh4") pod "0b3b285f-9406-4f9b-9768-8827933418d7" (UID: "0b3b285f-9406-4f9b-9768-8827933418d7"). InnerVolumeSpecName "kube-api-access-vzlh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.052641 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b1921b-4102-4959-abd7-c86ca3ae880e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "61b1921b-4102-4959-abd7-c86ca3ae880e" (UID: "61b1921b-4102-4959-abd7-c86ca3ae880e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121698 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121742 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzlh4\" (UniqueName: \"kubernetes.io/projected/0b3b285f-9406-4f9b-9768-8827933418d7-kube-api-access-vzlh4\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121759 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121773 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121787 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b1921b-4102-4959-abd7-c86ca3ae880e-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121797 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b3b285f-9406-4f9b-9768-8827933418d7-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121808 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwd5w\" (UniqueName: \"kubernetes.io/projected/61b1921b-4102-4959-abd7-c86ca3ae880e-kube-api-access-rwd5w\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121818 4676 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b3b285f-9406-4f9b-9768-8827933418d7-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.121828 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61b1921b-4102-4959-abd7-c86ca3ae880e-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.892082 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.892083 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qbdpc" event={"ID":"61b1921b-4102-4959-abd7-c86ca3ae880e","Type":"ContainerDied","Data":"4e025dbf33a38f3867667d8c52bbc6658048ca466e959b5560f4f5184bc50b80"} Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.892281 4676 scope.go:117] "RemoveContainer" containerID="b97143f2266a0da32711e5f0615f458fab187e7156146ccd6101995077239953" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.893175 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" event={"ID":"0b3b285f-9406-4f9b-9768-8827933418d7","Type":"ContainerDied","Data":"768459b8d0ae42a92971c99a8a1ddd447fbe9f91d0ef8e4efa77308b1d771a68"} Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.893307 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.909579 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-699fcd57b-hz4cp"] Sep 30 14:10:46 crc kubenswrapper[4676]: E0930 14:10:46.909908 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3b285f-9406-4f9b-9768-8827933418d7" containerName="route-controller-manager" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.909924 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3b285f-9406-4f9b-9768-8827933418d7" containerName="route-controller-manager" Sep 30 14:10:46 crc kubenswrapper[4676]: E0930 14:10:46.909938 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b1921b-4102-4959-abd7-c86ca3ae880e" containerName="controller-manager" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.909945 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b1921b-4102-4959-abd7-c86ca3ae880e" containerName="controller-manager" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.910052 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3b285f-9406-4f9b-9768-8827933418d7" containerName="route-controller-manager" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.910064 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b1921b-4102-4959-abd7-c86ca3ae880e" containerName="controller-manager" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.910576 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.915986 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.916620 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.916739 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.916998 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.917614 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.917859 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.923367 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p"] Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.924306 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.929775 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p"] Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.930372 4676 scope.go:117] "RemoveContainer" containerID="bb9f3c494c7aa1a871ede7d9a4d19bc790048d46eefbc6f4f7a6fc1e15ce6ddf" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932188 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-proxy-ca-bundles\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932224 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlqsn\" (UniqueName: \"kubernetes.io/projected/24e5713d-5cef-4c41-a315-6b70f0045699-kube-api-access-dlqsn\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932255 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e5713d-5cef-4c41-a315-6b70f0045699-serving-cert\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932277 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c6pzj\" (UniqueName: \"kubernetes.io/projected/b192003e-b2c2-46b7-9f76-af0f50110768-kube-api-access-c6pzj\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932301 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b192003e-b2c2-46b7-9f76-af0f50110768-client-ca\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932321 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b192003e-b2c2-46b7-9f76-af0f50110768-config\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932342 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-config\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932363 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-client-ca\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " 
pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.932390 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b192003e-b2c2-46b7-9f76-af0f50110768-serving-cert\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.936313 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.936659 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.936964 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.937099 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.937247 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.938101 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.939590 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.946033 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-699fcd57b-hz4cp"] Sep 30 14:10:46 crc kubenswrapper[4676]: I0930 14:10:46.994202 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qbdpc"] Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.004719 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qbdpc"] Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.010258 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"] Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.015656 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gn7dj"] Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.033748 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b192003e-b2c2-46b7-9f76-af0f50110768-client-ca\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034037 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b192003e-b2c2-46b7-9f76-af0f50110768-config\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034146 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-config\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: 
\"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-client-ca\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034303 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b192003e-b2c2-46b7-9f76-af0f50110768-serving-cert\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034392 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-proxy-ca-bundles\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034459 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlqsn\" (UniqueName: \"kubernetes.io/projected/24e5713d-5cef-4c41-a315-6b70f0045699-kube-api-access-dlqsn\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034543 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/24e5713d-5cef-4c41-a315-6b70f0045699-serving-cert\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.034618 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6pzj\" (UniqueName: \"kubernetes.io/projected/b192003e-b2c2-46b7-9f76-af0f50110768-kube-api-access-c6pzj\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.035094 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b192003e-b2c2-46b7-9f76-af0f50110768-client-ca\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.035358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b192003e-b2c2-46b7-9f76-af0f50110768-config\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.036155 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-client-ca\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.036592 
4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-proxy-ca-bundles\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.036680 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e5713d-5cef-4c41-a315-6b70f0045699-config\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.043161 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b192003e-b2c2-46b7-9f76-af0f50110768-serving-cert\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.047183 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e5713d-5cef-4c41-a315-6b70f0045699-serving-cert\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.053254 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6pzj\" (UniqueName: \"kubernetes.io/projected/b192003e-b2c2-46b7-9f76-af0f50110768-kube-api-access-c6pzj\") pod \"route-controller-manager-5b668748f7-c4w8p\" (UID: \"b192003e-b2c2-46b7-9f76-af0f50110768\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.058032 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlqsn\" (UniqueName: \"kubernetes.io/projected/24e5713d-5cef-4c41-a315-6b70f0045699-kube-api-access-dlqsn\") pod \"controller-manager-699fcd57b-hz4cp\" (UID: \"24e5713d-5cef-4c41-a315-6b70f0045699\") " pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.239574 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.263224 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.450694 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3b285f-9406-4f9b-9768-8827933418d7" path="/var/lib/kubelet/pods/0b3b285f-9406-4f9b-9768-8827933418d7/volumes" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.451562 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b1921b-4102-4959-abd7-c86ca3ae880e" path="/var/lib/kubelet/pods/61b1921b-4102-4959-abd7-c86ca3ae880e/volumes" Sep 30 14:10:47 crc kubenswrapper[4676]: W0930 14:10:47.509754 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e5713d_5cef_4c41_a315_6b70f0045699.slice/crio-b0c148e02b5eec0560bf60a43b7f10fb4c3439f3771f09b9b2794f52d7abf906 WatchSource:0}: Error finding container b0c148e02b5eec0560bf60a43b7f10fb4c3439f3771f09b9b2794f52d7abf906: Status 404 returned error can't find the container with id b0c148e02b5eec0560bf60a43b7f10fb4c3439f3771f09b9b2794f52d7abf906 Sep 30 14:10:47 crc 
kubenswrapper[4676]: I0930 14:10:47.514538 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699fcd57b-hz4cp"] Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.564382 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p"] Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.901356 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" event={"ID":"24e5713d-5cef-4c41-a315-6b70f0045699","Type":"ContainerStarted","Data":"69c2b0dbf9223db217763de3a26108ac99b0408ed8ecc188636cd5a0213b5722"} Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.901810 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" event={"ID":"24e5713d-5cef-4c41-a315-6b70f0045699","Type":"ContainerStarted","Data":"b0c148e02b5eec0560bf60a43b7f10fb4c3439f3771f09b9b2794f52d7abf906"} Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.903095 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.909102 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" event={"ID":"b192003e-b2c2-46b7-9f76-af0f50110768","Type":"ContainerStarted","Data":"cc1f6970c13d8f8ad5a1bc5d1ad362681f527c2723fa3f87ab71b3c618f366a6"} Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.909410 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" event={"ID":"b192003e-b2c2-46b7-9f76-af0f50110768","Type":"ContainerStarted","Data":"940da5c406a61aaf78a5f7a21c9c69d2d16e26702df3fe9034208d19cc2ed4c5"} Sep 30 14:10:47 crc kubenswrapper[4676]: 
I0930 14:10:47.909482 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.913430 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.936839 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-699fcd57b-hz4cp" podStartSLOduration=2.9368176139999997 podStartE2EDuration="2.936817614s" podCreationTimestamp="2025-09-30 14:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:10:47.934330478 +0000 UTC m=+751.917418907" watchObservedRunningTime="2025-09-30 14:10:47.936817614 +0000 UTC m=+751.919906043" Sep 30 14:10:47 crc kubenswrapper[4676]: I0930 14:10:47.971560 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" podStartSLOduration=2.97153811 podStartE2EDuration="2.97153811s" podCreationTimestamp="2025-09-30 14:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:10:47.968644375 +0000 UTC m=+751.951732824" watchObservedRunningTime="2025-09-30 14:10:47.97153811 +0000 UTC m=+751.954626539" Sep 30 14:10:48 crc kubenswrapper[4676]: I0930 14:10:48.261153 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b668748f7-c4w8p" Sep 30 14:10:53 crc kubenswrapper[4676]: I0930 14:10:53.189104 4676 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" 
Sep 30 14:10:59 crc kubenswrapper[4676]: I0930 14:10:59.927546 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zgcbx" podUID="1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" containerName="console" containerID="cri-o://2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1" gracePeriod=15 Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.055428 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg"] Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.057461 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.061302 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.068924 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg"] Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.151183 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.151249 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5248t\" (UniqueName: \"kubernetes.io/projected/c6917646-fb37-4aa8-bd63-a09bdb713ea1-kube-api-access-5248t\") pod 
\"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.151690 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.253906 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.253966 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5248t\" (UniqueName: \"kubernetes.io/projected/c6917646-fb37-4aa8-bd63-a09bdb713ea1-kube-api-access-5248t\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.254031 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.254607 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.254619 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.279406 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5248t\" (UniqueName: \"kubernetes.io/projected/c6917646-fb37-4aa8-bd63-a09bdb713ea1-kube-api-access-5248t\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.381320 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.430789 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zgcbx_1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d/console/0.log" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.430913 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.558344 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-oauth-config\") pod \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.558927 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-oauth-serving-cert\") pod \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.559807 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" (UID: "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.559923 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-config\") pod \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.560400 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-config" (OuterVolumeSpecName: "console-config") pod "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" (UID: "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.560441 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-service-ca\") pod \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.560485 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-service-ca" (OuterVolumeSpecName: "service-ca") pod "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" (UID: "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.560503 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-trusted-ca-bundle\") pod \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.560536 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-serving-cert\") pod \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.561077 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" (UID: "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.560603 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4m5j\" (UniqueName: \"kubernetes.io/projected/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-kube-api-access-x4m5j\") pod \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\" (UID: \"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d\") " Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.562591 4676 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.562661 4676 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.562681 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.562701 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.566644 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" (UID: "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.567522 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-kube-api-access-x4m5j" (OuterVolumeSpecName: "kube-api-access-x4m5j") pod "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" (UID: "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d"). InnerVolumeSpecName "kube-api-access-x4m5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.568074 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" (UID: "1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.664967 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4m5j\" (UniqueName: \"kubernetes.io/projected/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-kube-api-access-x4m5j\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.665026 4676 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.665050 4676 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:00 crc kubenswrapper[4676]: I0930 14:11:00.821197 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg"] Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.003378 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zgcbx_1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d/console/0.log" Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.003460 4676 generic.go:334] "Generic (PLEG): container finished" podID="1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" containerID="2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1" exitCode=2 Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.003556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zgcbx" event={"ID":"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d","Type":"ContainerDied","Data":"2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1"} Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.003600 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zgcbx" event={"ID":"1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d","Type":"ContainerDied","Data":"a7d1e40a274a98d8dff2e0d85a1bc4ade490d1fdac6303a396af1869601339d6"} Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.003633 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zgcbx" Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.003684 4676 scope.go:117] "RemoveContainer" containerID="2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1" Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.005936 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" event={"ID":"c6917646-fb37-4aa8-bd63-a09bdb713ea1","Type":"ContainerStarted","Data":"15604117543a91b11da77dec5d04a203b1fc0fad07e98bdc351f77c920ddedbb"} Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.006000 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" event={"ID":"c6917646-fb37-4aa8-bd63-a09bdb713ea1","Type":"ContainerStarted","Data":"2c03ff62c417a1c33332b21fd1d843379f8eb4296e00ccabdf3a8667dae96499"} Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.025152 4676 scope.go:117] "RemoveContainer" containerID="2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1" Sep 30 14:11:01 crc kubenswrapper[4676]: E0930 14:11:01.025766 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1\": container with ID starting with 2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1 not found: ID does not exist" containerID="2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1" Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.025836 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1"} err="failed to get container status \"2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1\": rpc error: code = NotFound desc = could 
not find container \"2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1\": container with ID starting with 2c4503746a0f21b49f56c0355db44aff31fb1af8504aa8170e55afcd34cbfce1 not found: ID does not exist" Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.055119 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zgcbx"] Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.057743 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zgcbx"] Sep 30 14:11:01 crc kubenswrapper[4676]: I0930 14:11:01.443276 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" path="/var/lib/kubelet/pods/1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d/volumes" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.014464 4676 generic.go:334] "Generic (PLEG): container finished" podID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerID="15604117543a91b11da77dec5d04a203b1fc0fad07e98bdc351f77c920ddedbb" exitCode=0 Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.014513 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" event={"ID":"c6917646-fb37-4aa8-bd63-a09bdb713ea1","Type":"ContainerDied","Data":"15604117543a91b11da77dec5d04a203b1fc0fad07e98bdc351f77c920ddedbb"} Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.413215 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8kfpw"] Sep 30 14:11:02 crc kubenswrapper[4676]: E0930 14:11:02.414534 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" containerName="console" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.414558 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" containerName="console" Sep 30 14:11:02 crc 
kubenswrapper[4676]: I0930 14:11:02.414690 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9d9d1d-543a-4ed7-8e04-b4dadf19af2d" containerName="console" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.415609 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.420067 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kfpw"] Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.498112 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239015d7-f8f2-4823-9094-6a3248c8e0a0-utilities\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.498175 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f8px\" (UniqueName: \"kubernetes.io/projected/239015d7-f8f2-4823-9094-6a3248c8e0a0-kube-api-access-6f8px\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.498202 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239015d7-f8f2-4823-9094-6a3248c8e0a0-catalog-content\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.599678 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/239015d7-f8f2-4823-9094-6a3248c8e0a0-utilities\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.599743 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f8px\" (UniqueName: \"kubernetes.io/projected/239015d7-f8f2-4823-9094-6a3248c8e0a0-kube-api-access-6f8px\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.599767 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239015d7-f8f2-4823-9094-6a3248c8e0a0-catalog-content\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.600345 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/239015d7-f8f2-4823-9094-6a3248c8e0a0-utilities\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.600374 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/239015d7-f8f2-4823-9094-6a3248c8e0a0-catalog-content\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.628323 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f8px\" (UniqueName: 
\"kubernetes.io/projected/239015d7-f8f2-4823-9094-6a3248c8e0a0-kube-api-access-6f8px\") pod \"redhat-operators-8kfpw\" (UID: \"239015d7-f8f2-4823-9094-6a3248c8e0a0\") " pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:02 crc kubenswrapper[4676]: I0930 14:11:02.738344 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:03 crc kubenswrapper[4676]: I0930 14:11:03.165677 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kfpw"] Sep 30 14:11:04 crc kubenswrapper[4676]: I0930 14:11:04.030424 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" event={"ID":"c6917646-fb37-4aa8-bd63-a09bdb713ea1","Type":"ContainerStarted","Data":"89e1248e1224933c2a0d72071d6f50a3ae4ac55f457f532eba6c8606ae1dbd1b"} Sep 30 14:11:04 crc kubenswrapper[4676]: I0930 14:11:04.032337 4676 generic.go:334] "Generic (PLEG): container finished" podID="239015d7-f8f2-4823-9094-6a3248c8e0a0" containerID="f376f1fab5002adbf30280ef0e02953fe843c9e283d293dc7e5124123b754642" exitCode=0 Sep 30 14:11:04 crc kubenswrapper[4676]: I0930 14:11:04.032394 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kfpw" event={"ID":"239015d7-f8f2-4823-9094-6a3248c8e0a0","Type":"ContainerDied","Data":"f376f1fab5002adbf30280ef0e02953fe843c9e283d293dc7e5124123b754642"} Sep 30 14:11:04 crc kubenswrapper[4676]: I0930 14:11:04.032428 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kfpw" event={"ID":"239015d7-f8f2-4823-9094-6a3248c8e0a0","Type":"ContainerStarted","Data":"e927f7a942f8023fa73db823deba9e1a132dbda3c4039a320b9a934c76e317f3"} Sep 30 14:11:05 crc kubenswrapper[4676]: I0930 14:11:05.073664 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerID="89e1248e1224933c2a0d72071d6f50a3ae4ac55f457f532eba6c8606ae1dbd1b" exitCode=0 Sep 30 14:11:05 crc kubenswrapper[4676]: I0930 14:11:05.073739 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" event={"ID":"c6917646-fb37-4aa8-bd63-a09bdb713ea1","Type":"ContainerDied","Data":"89e1248e1224933c2a0d72071d6f50a3ae4ac55f457f532eba6c8606ae1dbd1b"} Sep 30 14:11:06 crc kubenswrapper[4676]: I0930 14:11:06.083695 4676 generic.go:334] "Generic (PLEG): container finished" podID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerID="022021e7761283ac34760417f1659f7a0e4877cac2ecd47c54b15f6044ad1ce4" exitCode=0 Sep 30 14:11:06 crc kubenswrapper[4676]: I0930 14:11:06.083809 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" event={"ID":"c6917646-fb37-4aa8-bd63-a09bdb713ea1","Type":"ContainerDied","Data":"022021e7761283ac34760417f1659f7a0e4877cac2ecd47c54b15f6044ad1ce4"} Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.438124 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.584591 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-util\") pod \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.584697 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5248t\" (UniqueName: \"kubernetes.io/projected/c6917646-fb37-4aa8-bd63-a09bdb713ea1-kube-api-access-5248t\") pod \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.584717 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-bundle\") pod \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\" (UID: \"c6917646-fb37-4aa8-bd63-a09bdb713ea1\") " Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.585859 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-bundle" (OuterVolumeSpecName: "bundle") pod "c6917646-fb37-4aa8-bd63-a09bdb713ea1" (UID: "c6917646-fb37-4aa8-bd63-a09bdb713ea1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.590435 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6917646-fb37-4aa8-bd63-a09bdb713ea1-kube-api-access-5248t" (OuterVolumeSpecName: "kube-api-access-5248t") pod "c6917646-fb37-4aa8-bd63-a09bdb713ea1" (UID: "c6917646-fb37-4aa8-bd63-a09bdb713ea1"). InnerVolumeSpecName "kube-api-access-5248t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.595469 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-util" (OuterVolumeSpecName: "util") pod "c6917646-fb37-4aa8-bd63-a09bdb713ea1" (UID: "c6917646-fb37-4aa8-bd63-a09bdb713ea1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.685984 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5248t\" (UniqueName: \"kubernetes.io/projected/c6917646-fb37-4aa8-bd63-a09bdb713ea1-kube-api-access-5248t\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.686021 4676 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:07 crc kubenswrapper[4676]: I0930 14:11:07.686035 4676 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6917646-fb37-4aa8-bd63-a09bdb713ea1-util\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:08 crc kubenswrapper[4676]: I0930 14:11:08.097532 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" event={"ID":"c6917646-fb37-4aa8-bd63-a09bdb713ea1","Type":"ContainerDied","Data":"2c03ff62c417a1c33332b21fd1d843379f8eb4296e00ccabdf3a8667dae96499"} Sep 30 14:11:08 crc kubenswrapper[4676]: I0930 14:11:08.097580 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c03ff62c417a1c33332b21fd1d843379f8eb4296e00ccabdf3a8667dae96499" Sep 30 14:11:08 crc kubenswrapper[4676]: I0930 14:11:08.097599 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg" Sep 30 14:11:12 crc kubenswrapper[4676]: I0930 14:11:12.120394 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kfpw" event={"ID":"239015d7-f8f2-4823-9094-6a3248c8e0a0","Type":"ContainerStarted","Data":"6b622361001dd4ea0a63f90149ce1bcdf282480130c878731eb926c46ce2a29d"} Sep 30 14:11:13 crc kubenswrapper[4676]: I0930 14:11:13.127365 4676 generic.go:334] "Generic (PLEG): container finished" podID="239015d7-f8f2-4823-9094-6a3248c8e0a0" containerID="6b622361001dd4ea0a63f90149ce1bcdf282480130c878731eb926c46ce2a29d" exitCode=0 Sep 30 14:11:13 crc kubenswrapper[4676]: I0930 14:11:13.127421 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kfpw" event={"ID":"239015d7-f8f2-4823-9094-6a3248c8e0a0","Type":"ContainerDied","Data":"6b622361001dd4ea0a63f90149ce1bcdf282480130c878731eb926c46ce2a29d"} Sep 30 14:11:14 crc kubenswrapper[4676]: I0930 14:11:14.135757 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kfpw" event={"ID":"239015d7-f8f2-4823-9094-6a3248c8e0a0","Type":"ContainerStarted","Data":"6dda16e7620900e8b567c945d0748086a935713b9bebe5f8a6a8b1a65003ce02"} Sep 30 14:11:14 crc kubenswrapper[4676]: I0930 14:11:14.152167 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8kfpw" podStartSLOduration=2.65707568 podStartE2EDuration="12.152115817s" podCreationTimestamp="2025-09-30 14:11:02 +0000 UTC" firstStartedPulling="2025-09-30 14:11:04.034051292 +0000 UTC m=+768.017139761" lastFinishedPulling="2025-09-30 14:11:13.529091459 +0000 UTC m=+777.512179898" observedRunningTime="2025-09-30 14:11:14.151149123 +0000 UTC m=+778.134237572" watchObservedRunningTime="2025-09-30 14:11:14.152115817 +0000 UTC m=+778.135204256" Sep 30 14:11:18 crc 
kubenswrapper[4676]: I0930 14:11:18.174584 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6"] Sep 30 14:11:18 crc kubenswrapper[4676]: E0930 14:11:18.175560 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerName="extract" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.175574 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerName="extract" Sep 30 14:11:18 crc kubenswrapper[4676]: E0930 14:11:18.175591 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerName="util" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.175598 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerName="util" Sep 30 14:11:18 crc kubenswrapper[4676]: E0930 14:11:18.175612 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerName="pull" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.175619 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerName="pull" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.175735 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6917646-fb37-4aa8-bd63-a09bdb713ea1" containerName="extract" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.176198 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.178937 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fczz4" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.179622 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.179642 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.179931 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.196801 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.201677 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6"] Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.235933 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a1324e4-02a6-4c81-b533-086cbd21e10f-apiservice-cert\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.236264 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcd6\" (UniqueName: \"kubernetes.io/projected/2a1324e4-02a6-4c81-b533-086cbd21e10f-kube-api-access-2kcd6\") pod 
\"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.236356 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a1324e4-02a6-4c81-b533-086cbd21e10f-webhook-cert\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.337283 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a1324e4-02a6-4c81-b533-086cbd21e10f-apiservice-cert\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.337382 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcd6\" (UniqueName: \"kubernetes.io/projected/2a1324e4-02a6-4c81-b533-086cbd21e10f-kube-api-access-2kcd6\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.337417 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a1324e4-02a6-4c81-b533-086cbd21e10f-webhook-cert\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc 
kubenswrapper[4676]: I0930 14:11:18.345770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a1324e4-02a6-4c81-b533-086cbd21e10f-webhook-cert\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.357466 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a1324e4-02a6-4c81-b533-086cbd21e10f-apiservice-cert\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.366055 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcd6\" (UniqueName: \"kubernetes.io/projected/2a1324e4-02a6-4c81-b533-086cbd21e10f-kube-api-access-2kcd6\") pod \"metallb-operator-controller-manager-76c7cc4875-dchh6\" (UID: \"2a1324e4-02a6-4c81-b533-086cbd21e10f\") " pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.498049 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.512137 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc"] Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.513143 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.515816 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.515895 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.516216 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jbqrs" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.542272 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc"] Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.644435 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-apiservice-cert\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.644795 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bfhw\" (UniqueName: \"kubernetes.io/projected/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-kube-api-access-7bfhw\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.644854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-webhook-cert\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.746733 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-webhook-cert\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.746808 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-apiservice-cert\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.746846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bfhw\" (UniqueName: \"kubernetes.io/projected/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-kube-api-access-7bfhw\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.752389 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-apiservice-cert\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc 
kubenswrapper[4676]: I0930 14:11:18.754674 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-webhook-cert\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.766607 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bfhw\" (UniqueName: \"kubernetes.io/projected/eb9ac7e3-b48e-44d4-9053-7e5ecec6a138-kube-api-access-7bfhw\") pod \"metallb-operator-webhook-server-fb5687d59-w22bc\" (UID: \"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138\") " pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:18 crc kubenswrapper[4676]: I0930 14:11:18.890803 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:19 crc kubenswrapper[4676]: I0930 14:11:19.043372 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6"] Sep 30 14:11:19 crc kubenswrapper[4676]: W0930 14:11:19.070604 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a1324e4_02a6_4c81_b533_086cbd21e10f.slice/crio-faefdb7146045c8a027d0a971367e79779f032a9ad0ae4f078d334a233c3bd11 WatchSource:0}: Error finding container faefdb7146045c8a027d0a971367e79779f032a9ad0ae4f078d334a233c3bd11: Status 404 returned error can't find the container with id faefdb7146045c8a027d0a971367e79779f032a9ad0ae4f078d334a233c3bd11 Sep 30 14:11:19 crc kubenswrapper[4676]: I0930 14:11:19.165662 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" 
event={"ID":"2a1324e4-02a6-4c81-b533-086cbd21e10f","Type":"ContainerStarted","Data":"faefdb7146045c8a027d0a971367e79779f032a9ad0ae4f078d334a233c3bd11"} Sep 30 14:11:19 crc kubenswrapper[4676]: I0930 14:11:19.452366 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc"] Sep 30 14:11:19 crc kubenswrapper[4676]: W0930 14:11:19.460341 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb9ac7e3_b48e_44d4_9053_7e5ecec6a138.slice/crio-45784fe81131f45676607a9245673b9ceefde483856118d9ffc9fcb589c8ea1a WatchSource:0}: Error finding container 45784fe81131f45676607a9245673b9ceefde483856118d9ffc9fcb589c8ea1a: Status 404 returned error can't find the container with id 45784fe81131f45676607a9245673b9ceefde483856118d9ffc9fcb589c8ea1a Sep 30 14:11:20 crc kubenswrapper[4676]: I0930 14:11:20.176107 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" event={"ID":"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138","Type":"ContainerStarted","Data":"45784fe81131f45676607a9245673b9ceefde483856118d9ffc9fcb589c8ea1a"} Sep 30 14:11:22 crc kubenswrapper[4676]: I0930 14:11:22.739388 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:22 crc kubenswrapper[4676]: I0930 14:11:22.739721 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:22 crc kubenswrapper[4676]: I0930 14:11:22.788640 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:23 crc kubenswrapper[4676]: I0930 14:11:23.268940 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8kfpw" Sep 30 14:11:23 crc 
kubenswrapper[4676]: I0930 14:11:23.664413 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kfpw"] Sep 30 14:11:23 crc kubenswrapper[4676]: I0930 14:11:23.709742 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9lpt"] Sep 30 14:11:23 crc kubenswrapper[4676]: I0930 14:11:23.710081 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l9lpt" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="registry-server" containerID="cri-o://e0032e8cd9fe7d955fedefcfc265b528b16990ebcf898e7d0071de4fe5ca07e8" gracePeriod=2 Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.252127 4676 generic.go:334] "Generic (PLEG): container finished" podID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerID="e0032e8cd9fe7d955fedefcfc265b528b16990ebcf898e7d0071de4fe5ca07e8" exitCode=0 Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.253532 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9lpt" event={"ID":"c2edaf2e-4def-4177-87e6-6e9e1a62f16b","Type":"ContainerDied","Data":"e0032e8cd9fe7d955fedefcfc265b528b16990ebcf898e7d0071de4fe5ca07e8"} Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.346709 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9lpt" Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.468855 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-catalog-content\") pod \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.481885 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-utilities\") pod \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.482004 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsnp4\" (UniqueName: \"kubernetes.io/projected/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-kube-api-access-gsnp4\") pod \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\" (UID: \"c2edaf2e-4def-4177-87e6-6e9e1a62f16b\") " Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.483160 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-utilities" (OuterVolumeSpecName: "utilities") pod "c2edaf2e-4def-4177-87e6-6e9e1a62f16b" (UID: "c2edaf2e-4def-4177-87e6-6e9e1a62f16b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.499291 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-kube-api-access-gsnp4" (OuterVolumeSpecName: "kube-api-access-gsnp4") pod "c2edaf2e-4def-4177-87e6-6e9e1a62f16b" (UID: "c2edaf2e-4def-4177-87e6-6e9e1a62f16b"). InnerVolumeSpecName "kube-api-access-gsnp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.584628 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.585179 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsnp4\" (UniqueName: \"kubernetes.io/projected/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-kube-api-access-gsnp4\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.589352 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2edaf2e-4def-4177-87e6-6e9e1a62f16b" (UID: "c2edaf2e-4def-4177-87e6-6e9e1a62f16b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:11:25 crc kubenswrapper[4676]: I0930 14:11:25.686643 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2edaf2e-4def-4177-87e6-6e9e1a62f16b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:26 crc kubenswrapper[4676]: I0930 14:11:26.265871 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l9lpt" event={"ID":"c2edaf2e-4def-4177-87e6-6e9e1a62f16b","Type":"ContainerDied","Data":"2809c953b8fe8193b35538cd811c4ddd9c97f5d39151b90bde5bf9a25821baca"} Sep 30 14:11:26 crc kubenswrapper[4676]: I0930 14:11:26.265980 4676 scope.go:117] "RemoveContainer" containerID="e0032e8cd9fe7d955fedefcfc265b528b16990ebcf898e7d0071de4fe5ca07e8" Sep 30 14:11:26 crc kubenswrapper[4676]: I0930 14:11:26.266115 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l9lpt" Sep 30 14:11:26 crc kubenswrapper[4676]: I0930 14:11:26.310660 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l9lpt"] Sep 30 14:11:26 crc kubenswrapper[4676]: I0930 14:11:26.314036 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l9lpt"] Sep 30 14:11:26 crc kubenswrapper[4676]: I0930 14:11:26.330916 4676 scope.go:117] "RemoveContainer" containerID="97823dcb6c5d319e48de696227056449e81b62d8716c2e63cb70278bda65d77f" Sep 30 14:11:26 crc kubenswrapper[4676]: I0930 14:11:26.363364 4676 scope.go:117] "RemoveContainer" containerID="cf235d82ff40a1de080292475ad9c354382c1b82a530b8fa88f2f0e58edd0ae1" Sep 30 14:11:27 crc kubenswrapper[4676]: I0930 14:11:27.442994 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" path="/var/lib/kubelet/pods/c2edaf2e-4def-4177-87e6-6e9e1a62f16b/volumes" Sep 30 14:11:29 crc kubenswrapper[4676]: I0930 14:11:29.920386 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:11:29 crc kubenswrapper[4676]: I0930 14:11:29.920730 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:11:32 crc kubenswrapper[4676]: I0930 14:11:32.315215 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" 
event={"ID":"2a1324e4-02a6-4c81-b533-086cbd21e10f","Type":"ContainerStarted","Data":"9e15f5d4e5a03da78b871e8de81f22682bddb37ccfbf241523d94b36785bfda6"} Sep 30 14:11:32 crc kubenswrapper[4676]: I0930 14:11:32.315770 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 14:11:32 crc kubenswrapper[4676]: I0930 14:11:32.316718 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" event={"ID":"eb9ac7e3-b48e-44d4-9053-7e5ecec6a138","Type":"ContainerStarted","Data":"0f80f56145a0174a1e2f2e81bc603ee4488ec366973f0152a54084f6ef842b34"} Sep 30 14:11:32 crc kubenswrapper[4676]: I0930 14:11:32.316913 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:32 crc kubenswrapper[4676]: I0930 14:11:32.342948 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" podStartSLOduration=2.328058547 podStartE2EDuration="14.342917035s" podCreationTimestamp="2025-09-30 14:11:18 +0000 UTC" firstStartedPulling="2025-09-30 14:11:19.087318267 +0000 UTC m=+783.070406696" lastFinishedPulling="2025-09-30 14:11:31.102176755 +0000 UTC m=+795.085265184" observedRunningTime="2025-09-30 14:11:32.337444769 +0000 UTC m=+796.320533208" watchObservedRunningTime="2025-09-30 14:11:32.342917035 +0000 UTC m=+796.326005474" Sep 30 14:11:32 crc kubenswrapper[4676]: I0930 14:11:32.365537 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" podStartSLOduration=2.723318463 podStartE2EDuration="14.365508021s" podCreationTimestamp="2025-09-30 14:11:18 +0000 UTC" firstStartedPulling="2025-09-30 14:11:19.466372809 +0000 UTC m=+783.449461238" lastFinishedPulling="2025-09-30 
14:11:31.108562377 +0000 UTC m=+795.091650796" observedRunningTime="2025-09-30 14:11:32.360481406 +0000 UTC m=+796.343569845" watchObservedRunningTime="2025-09-30 14:11:32.365508021 +0000 UTC m=+796.348596440" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.313480 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khftk"] Sep 30 14:11:41 crc kubenswrapper[4676]: E0930 14:11:41.314534 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="extract-utilities" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.314551 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="extract-utilities" Sep 30 14:11:41 crc kubenswrapper[4676]: E0930 14:11:41.314589 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="extract-content" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.314600 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="extract-content" Sep 30 14:11:41 crc kubenswrapper[4676]: E0930 14:11:41.314613 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="registry-server" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.314622 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="registry-server" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.314748 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2edaf2e-4def-4177-87e6-6e9e1a62f16b" containerName="registry-server" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.315928 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.336624 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khftk"] Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.425257 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-catalog-content\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.425407 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bl78\" (UniqueName: \"kubernetes.io/projected/33a3b1ba-20dd-4377-8225-c0103b65a68d-kube-api-access-8bl78\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.425492 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-utilities\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.527485 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-utilities\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.527576 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-catalog-content\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.527642 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bl78\" (UniqueName: \"kubernetes.io/projected/33a3b1ba-20dd-4377-8225-c0103b65a68d-kube-api-access-8bl78\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.528013 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-utilities\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.528156 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-catalog-content\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.549493 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bl78\" (UniqueName: \"kubernetes.io/projected/33a3b1ba-20dd-4377-8225-c0103b65a68d-kube-api-access-8bl78\") pod \"community-operators-khftk\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:41 crc kubenswrapper[4676]: I0930 14:11:41.643289 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:42 crc kubenswrapper[4676]: I0930 14:11:42.190827 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khftk"] Sep 30 14:11:42 crc kubenswrapper[4676]: I0930 14:11:42.373018 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khftk" event={"ID":"33a3b1ba-20dd-4377-8225-c0103b65a68d","Type":"ContainerStarted","Data":"2e5fc7551bda9eaa7665ce408f0e3324a79149228a6bcf970a014acd4921777d"} Sep 30 14:11:42 crc kubenswrapper[4676]: I0930 14:11:42.373069 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khftk" event={"ID":"33a3b1ba-20dd-4377-8225-c0103b65a68d","Type":"ContainerStarted","Data":"6dd404be19b7d7554d2e5811deb70089feb382acf0e3481f1f82851de31f9fc9"} Sep 30 14:11:43 crc kubenswrapper[4676]: I0930 14:11:43.381810 4676 generic.go:334] "Generic (PLEG): container finished" podID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerID="2e5fc7551bda9eaa7665ce408f0e3324a79149228a6bcf970a014acd4921777d" exitCode=0 Sep 30 14:11:43 crc kubenswrapper[4676]: I0930 14:11:43.381919 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khftk" event={"ID":"33a3b1ba-20dd-4377-8225-c0103b65a68d","Type":"ContainerDied","Data":"2e5fc7551bda9eaa7665ce408f0e3324a79149228a6bcf970a014acd4921777d"} Sep 30 14:11:44 crc kubenswrapper[4676]: I0930 14:11:44.391993 4676 generic.go:334] "Generic (PLEG): container finished" podID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerID="55657124539b5d89602b934b55eaca5975b95e70751cdb502b1dfc1f01618d92" exitCode=0 Sep 30 14:11:44 crc kubenswrapper[4676]: I0930 14:11:44.392049 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khftk" 
event={"ID":"33a3b1ba-20dd-4377-8225-c0103b65a68d","Type":"ContainerDied","Data":"55657124539b5d89602b934b55eaca5975b95e70751cdb502b1dfc1f01618d92"} Sep 30 14:11:45 crc kubenswrapper[4676]: I0930 14:11:45.399128 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khftk" event={"ID":"33a3b1ba-20dd-4377-8225-c0103b65a68d","Type":"ContainerStarted","Data":"6203286c6e0c355901767abe7ef5a1201e7df5880ea892f0d826852200594e54"} Sep 30 14:11:45 crc kubenswrapper[4676]: I0930 14:11:45.416426 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khftk" podStartSLOduration=1.905512734 podStartE2EDuration="4.416403199s" podCreationTimestamp="2025-09-30 14:11:41 +0000 UTC" firstStartedPulling="2025-09-30 14:11:42.375021964 +0000 UTC m=+806.358110393" lastFinishedPulling="2025-09-30 14:11:44.885912429 +0000 UTC m=+808.869000858" observedRunningTime="2025-09-30 14:11:45.414857738 +0000 UTC m=+809.397946187" watchObservedRunningTime="2025-09-30 14:11:45.416403199 +0000 UTC m=+809.399491648" Sep 30 14:11:48 crc kubenswrapper[4676]: I0930 14:11:48.898249 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-fb5687d59-w22bc" Sep 30 14:11:51 crc kubenswrapper[4676]: I0930 14:11:51.643846 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:51 crc kubenswrapper[4676]: I0930 14:11:51.644272 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:51 crc kubenswrapper[4676]: I0930 14:11:51.684145 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:52 crc kubenswrapper[4676]: I0930 14:11:52.479105 4676 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:52 crc kubenswrapper[4676]: I0930 14:11:52.529317 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khftk"] Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.324018 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hhbgw"] Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.326025 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.343335 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhbgw"] Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.413808 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-utilities\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.413860 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkcw\" (UniqueName: \"kubernetes.io/projected/3b639324-ea71-45e6-8525-57b11772d9c6-kube-api-access-hwkcw\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.414000 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-catalog-content\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " 
pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.451584 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-khftk" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="registry-server" containerID="cri-o://6203286c6e0c355901767abe7ef5a1201e7df5880ea892f0d826852200594e54" gracePeriod=2 Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.515558 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-catalog-content\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.515744 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-utilities\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.515771 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkcw\" (UniqueName: \"kubernetes.io/projected/3b639324-ea71-45e6-8525-57b11772d9c6-kube-api-access-hwkcw\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.516156 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-catalog-content\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " 
pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.517303 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-utilities\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.541283 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkcw\" (UniqueName: \"kubernetes.io/projected/3b639324-ea71-45e6-8525-57b11772d9c6-kube-api-access-hwkcw\") pod \"certified-operators-hhbgw\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.645764 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:11:54 crc kubenswrapper[4676]: I0930 14:11:54.951309 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhbgw"] Sep 30 14:11:55 crc kubenswrapper[4676]: I0930 14:11:55.458665 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhbgw" event={"ID":"3b639324-ea71-45e6-8525-57b11772d9c6","Type":"ContainerStarted","Data":"352e8057579e35b3e045c736ca0352649289b93ccc97cac7d80fce2f885eb43c"} Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.465283 4676 generic.go:334] "Generic (PLEG): container finished" podID="3b639324-ea71-45e6-8525-57b11772d9c6" containerID="47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83" exitCode=0 Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.465360 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhbgw" 
event={"ID":"3b639324-ea71-45e6-8525-57b11772d9c6","Type":"ContainerDied","Data":"47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83"} Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.470153 4676 generic.go:334] "Generic (PLEG): container finished" podID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerID="6203286c6e0c355901767abe7ef5a1201e7df5880ea892f0d826852200594e54" exitCode=0 Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.470203 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khftk" event={"ID":"33a3b1ba-20dd-4377-8225-c0103b65a68d","Type":"ContainerDied","Data":"6203286c6e0c355901767abe7ef5a1201e7df5880ea892f0d826852200594e54"} Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.470237 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khftk" event={"ID":"33a3b1ba-20dd-4377-8225-c0103b65a68d","Type":"ContainerDied","Data":"6dd404be19b7d7554d2e5811deb70089feb382acf0e3481f1f82851de31f9fc9"} Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.470250 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd404be19b7d7554d2e5811deb70089feb382acf0e3481f1f82851de31f9fc9" Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.508740 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.545938 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-utilities\") pod \"33a3b1ba-20dd-4377-8225-c0103b65a68d\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.546137 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-catalog-content\") pod \"33a3b1ba-20dd-4377-8225-c0103b65a68d\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.546231 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bl78\" (UniqueName: \"kubernetes.io/projected/33a3b1ba-20dd-4377-8225-c0103b65a68d-kube-api-access-8bl78\") pod \"33a3b1ba-20dd-4377-8225-c0103b65a68d\" (UID: \"33a3b1ba-20dd-4377-8225-c0103b65a68d\") " Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.547217 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-utilities" (OuterVolumeSpecName: "utilities") pod "33a3b1ba-20dd-4377-8225-c0103b65a68d" (UID: "33a3b1ba-20dd-4377-8225-c0103b65a68d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.553268 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a3b1ba-20dd-4377-8225-c0103b65a68d-kube-api-access-8bl78" (OuterVolumeSpecName: "kube-api-access-8bl78") pod "33a3b1ba-20dd-4377-8225-c0103b65a68d" (UID: "33a3b1ba-20dd-4377-8225-c0103b65a68d"). InnerVolumeSpecName "kube-api-access-8bl78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.596847 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33a3b1ba-20dd-4377-8225-c0103b65a68d" (UID: "33a3b1ba-20dd-4377-8225-c0103b65a68d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.648224 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bl78\" (UniqueName: \"kubernetes.io/projected/33a3b1ba-20dd-4377-8225-c0103b65a68d-kube-api-access-8bl78\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.648269 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:56 crc kubenswrapper[4676]: I0930 14:11:56.648280 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33a3b1ba-20dd-4377-8225-c0103b65a68d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:11:57 crc kubenswrapper[4676]: I0930 14:11:57.477415 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khftk" Sep 30 14:11:57 crc kubenswrapper[4676]: I0930 14:11:57.497080 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-khftk"] Sep 30 14:11:57 crc kubenswrapper[4676]: I0930 14:11:57.501590 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-khftk"] Sep 30 14:11:58 crc kubenswrapper[4676]: I0930 14:11:58.487264 4676 generic.go:334] "Generic (PLEG): container finished" podID="3b639324-ea71-45e6-8525-57b11772d9c6" containerID="46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0" exitCode=0 Sep 30 14:11:58 crc kubenswrapper[4676]: I0930 14:11:58.487329 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhbgw" event={"ID":"3b639324-ea71-45e6-8525-57b11772d9c6","Type":"ContainerDied","Data":"46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0"} Sep 30 14:11:59 crc kubenswrapper[4676]: I0930 14:11:59.441845 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" path="/var/lib/kubelet/pods/33a3b1ba-20dd-4377-8225-c0103b65a68d/volumes" Sep 30 14:11:59 crc kubenswrapper[4676]: I0930 14:11:59.495590 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhbgw" event={"ID":"3b639324-ea71-45e6-8525-57b11772d9c6","Type":"ContainerStarted","Data":"f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0"} Sep 30 14:11:59 crc kubenswrapper[4676]: I0930 14:11:59.519982 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hhbgw" podStartSLOduration=2.724553753 podStartE2EDuration="5.519961196s" podCreationTimestamp="2025-09-30 14:11:54 +0000 UTC" firstStartedPulling="2025-09-30 14:11:56.466828304 +0000 UTC m=+820.449916733" lastFinishedPulling="2025-09-30 
14:11:59.262235707 +0000 UTC m=+823.245324176" observedRunningTime="2025-09-30 14:11:59.514573671 +0000 UTC m=+823.497662110" watchObservedRunningTime="2025-09-30 14:11:59.519961196 +0000 UTC m=+823.503049645" Sep 30 14:11:59 crc kubenswrapper[4676]: I0930 14:11:59.919844 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:11:59 crc kubenswrapper[4676]: I0930 14:11:59.919956 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:12:04 crc kubenswrapper[4676]: I0930 14:12:04.647505 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:12:04 crc kubenswrapper[4676]: I0930 14:12:04.650070 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:12:04 crc kubenswrapper[4676]: I0930 14:12:04.692143 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:12:05 crc kubenswrapper[4676]: I0930 14:12:05.581189 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:12:05 crc kubenswrapper[4676]: I0930 14:12:05.633329 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhbgw"] Sep 30 14:12:07 crc kubenswrapper[4676]: I0930 14:12:07.553512 4676 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-hhbgw" podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="registry-server" containerID="cri-o://f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0" gracePeriod=2 Sep 30 14:12:07 crc kubenswrapper[4676]: I0930 14:12:07.970161 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.018758 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-utilities\") pod \"3b639324-ea71-45e6-8525-57b11772d9c6\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.018847 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkcw\" (UniqueName: \"kubernetes.io/projected/3b639324-ea71-45e6-8525-57b11772d9c6-kube-api-access-hwkcw\") pod \"3b639324-ea71-45e6-8525-57b11772d9c6\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.018996 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-catalog-content\") pod \"3b639324-ea71-45e6-8525-57b11772d9c6\" (UID: \"3b639324-ea71-45e6-8525-57b11772d9c6\") " Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.020684 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-utilities" (OuterVolumeSpecName: "utilities") pod "3b639324-ea71-45e6-8525-57b11772d9c6" (UID: "3b639324-ea71-45e6-8525-57b11772d9c6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.035399 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b639324-ea71-45e6-8525-57b11772d9c6-kube-api-access-hwkcw" (OuterVolumeSpecName: "kube-api-access-hwkcw") pod "3b639324-ea71-45e6-8525-57b11772d9c6" (UID: "3b639324-ea71-45e6-8525-57b11772d9c6"). InnerVolumeSpecName "kube-api-access-hwkcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.073488 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b639324-ea71-45e6-8525-57b11772d9c6" (UID: "3b639324-ea71-45e6-8525-57b11772d9c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.121047 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.121082 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b639324-ea71-45e6-8525-57b11772d9c6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.121121 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwkcw\" (UniqueName: \"kubernetes.io/projected/3b639324-ea71-45e6-8525-57b11772d9c6-kube-api-access-hwkcw\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.501026 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-76c7cc4875-dchh6" Sep 30 
14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.563369 4676 generic.go:334] "Generic (PLEG): container finished" podID="3b639324-ea71-45e6-8525-57b11772d9c6" containerID="f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0" exitCode=0 Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.563429 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhbgw" event={"ID":"3b639324-ea71-45e6-8525-57b11772d9c6","Type":"ContainerDied","Data":"f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0"} Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.563468 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhbgw" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.563493 4676 scope.go:117] "RemoveContainer" containerID="f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.563482 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhbgw" event={"ID":"3b639324-ea71-45e6-8525-57b11772d9c6","Type":"ContainerDied","Data":"352e8057579e35b3e045c736ca0352649289b93ccc97cac7d80fce2f885eb43c"} Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.583571 4676 scope.go:117] "RemoveContainer" containerID="46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.599353 4676 scope.go:117] "RemoveContainer" containerID="47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.611363 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhbgw"] Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.615978 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hhbgw"] Sep 30 14:12:08 crc 
kubenswrapper[4676]: I0930 14:12:08.629505 4676 scope.go:117] "RemoveContainer" containerID="f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0" Sep 30 14:12:08 crc kubenswrapper[4676]: E0930 14:12:08.630595 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0\": container with ID starting with f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0 not found: ID does not exist" containerID="f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.630628 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0"} err="failed to get container status \"f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0\": rpc error: code = NotFound desc = could not find container \"f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0\": container with ID starting with f224b95993298320a5d732f6f4afabc6056c4dc6704fbce9d455e941dde57df0 not found: ID does not exist" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.630652 4676 scope.go:117] "RemoveContainer" containerID="46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0" Sep 30 14:12:08 crc kubenswrapper[4676]: E0930 14:12:08.630947 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0\": container with ID starting with 46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0 not found: ID does not exist" containerID="46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.630970 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0"} err="failed to get container status \"46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0\": rpc error: code = NotFound desc = could not find container \"46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0\": container with ID starting with 46d335d4b9082f27c9bddf4c57305c642760cc9eaa10976c7cf48f1ed3091dc0 not found: ID does not exist" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.630983 4676 scope.go:117] "RemoveContainer" containerID="47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83" Sep 30 14:12:08 crc kubenswrapper[4676]: E0930 14:12:08.631383 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83\": container with ID starting with 47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83 not found: ID does not exist" containerID="47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83" Sep 30 14:12:08 crc kubenswrapper[4676]: I0930 14:12:08.631457 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83"} err="failed to get container status \"47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83\": rpc error: code = NotFound desc = could not find container \"47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83\": container with ID starting with 47ee6b904fead1e71e465258b420e20c6d5c11c9e28d7fdf515aba0ce73fda83 not found: ID does not exist" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209173 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-x4mlw"] Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.209505 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="extract-utilities" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209526 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="extract-utilities" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.209540 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="registry-server" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209553 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="registry-server" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.209565 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="extract-content" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209572 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="extract-content" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.209594 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="registry-server" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209602 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="registry-server" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.209613 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="extract-content" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209621 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="extract-content" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.209635 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="extract-utilities" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209643 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="extract-utilities" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209771 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b639324-ea71-45e6-8525-57b11772d9c6" containerName="registry-server" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.209794 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a3b1ba-20dd-4377-8225-c0103b65a68d" containerName="registry-server" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.212438 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.230035 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd"] Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.230830 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-lwmp9" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.230828 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.231004 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.231302 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239383 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239773 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239806 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-sockets\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239834 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-reloader\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239860 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics-certs\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239913 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtc4r\" (UniqueName: \"kubernetes.io/projected/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-kube-api-access-rtc4r\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239959 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-startup\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.239985 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-conf\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.255143 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd"] Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.341872 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-conf\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.341961 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518077e0-6a46-480c-9cdd-d5d5c64814b7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-rdndd\" (UID: \"518077e0-6a46-480c-9cdd-d5d5c64814b7\") " 
pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.341989 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342006 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-sockets\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342030 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-reloader\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342056 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics-certs\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342077 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtc4r\" (UniqueName: \"kubernetes.io/projected/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-kube-api-access-rtc4r\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342100 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkz5\" (UniqueName: \"kubernetes.io/projected/518077e0-6a46-480c-9cdd-d5d5c64814b7-kube-api-access-xmkz5\") pod \"frr-k8s-webhook-server-5478bdb765-rdndd\" (UID: \"518077e0-6a46-480c-9cdd-d5d5c64814b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342153 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-startup\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342509 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-sockets\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342902 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-reloader\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.342912 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-conf\") pod 
\"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.343032 4676 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.343081 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-frr-startup\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.343103 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics-certs podName:5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5 nodeName:}" failed. No retries permitted until 2025-09-30 14:12:09.843079821 +0000 UTC m=+833.826168250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics-certs") pod "frr-k8s-x4mlw" (UID: "5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5") : secret "frr-k8s-certs-secret" not found Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.378494 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtc4r\" (UniqueName: \"kubernetes.io/projected/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-kube-api-access-rtc4r\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.386523 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-4gcd5"] Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.387741 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.394307 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.394314 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fwh4z" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.394327 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.394452 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.408323 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-c24th"] Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.409662 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.415345 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443451 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443530 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmpzl\" (UniqueName: \"kubernetes.io/projected/43a57b66-554a-40f3-ae9c-1f8dd4053405-kube-api-access-fmpzl\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443564 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkz5\" (UniqueName: \"kubernetes.io/projected/518077e0-6a46-480c-9cdd-d5d5c64814b7-kube-api-access-xmkz5\") pod \"frr-k8s-webhook-server-5478bdb765-rdndd\" (UID: \"518077e0-6a46-480c-9cdd-d5d5c64814b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443595 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwsnc\" (UniqueName: \"kubernetes.io/projected/32604773-f635-41b2-a665-740ace937075-kube-api-access-rwsnc\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443626 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-metrics-certs\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443645 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-metrics-certs\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443658 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-cert\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443687 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32604773-f635-41b2-a665-740ace937075-metallb-excludel2\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.443708 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518077e0-6a46-480c-9cdd-d5d5c64814b7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-rdndd\" (UID: \"518077e0-6a46-480c-9cdd-d5d5c64814b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.448266 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3b639324-ea71-45e6-8525-57b11772d9c6" path="/var/lib/kubelet/pods/3b639324-ea71-45e6-8525-57b11772d9c6/volumes" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.449028 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-c24th"] Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.450747 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/518077e0-6a46-480c-9cdd-d5d5c64814b7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-rdndd\" (UID: \"518077e0-6a46-480c-9cdd-d5d5c64814b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.495347 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkz5\" (UniqueName: \"kubernetes.io/projected/518077e0-6a46-480c-9cdd-d5d5c64814b7-kube-api-access-xmkz5\") pod \"frr-k8s-webhook-server-5478bdb765-rdndd\" (UID: \"518077e0-6a46-480c-9cdd-d5d5c64814b7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.545174 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.545233 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmpzl\" (UniqueName: \"kubernetes.io/projected/43a57b66-554a-40f3-ae9c-1f8dd4053405-kube-api-access-fmpzl\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.545277 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rwsnc\" (UniqueName: \"kubernetes.io/projected/32604773-f635-41b2-a665-740ace937075-kube-api-access-rwsnc\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.545306 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-metrics-certs\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.545324 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-metrics-certs\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.545332 4676 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.545402 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist podName:32604773-f635-41b2-a665-740ace937075 nodeName:}" failed. No retries permitted until 2025-09-30 14:12:10.045381515 +0000 UTC m=+834.028469944 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist") pod "speaker-4gcd5" (UID: "32604773-f635-41b2-a665-740ace937075") : secret "metallb-memberlist" not found Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.545477 4676 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.545568 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-metrics-certs podName:32604773-f635-41b2-a665-740ace937075 nodeName:}" failed. No retries permitted until 2025-09-30 14:12:10.045545599 +0000 UTC m=+834.028634028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-metrics-certs") pod "speaker-4gcd5" (UID: "32604773-f635-41b2-a665-740ace937075") : secret "speaker-certs-secret" not found Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.545338 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-cert\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.545709 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32604773-f635-41b2-a665-740ace937075-metallb-excludel2\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.546473 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/32604773-f635-41b2-a665-740ace937075-metallb-excludel2\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.546559 4676 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Sep 30 14:12:09 crc kubenswrapper[4676]: E0930 14:12:09.546606 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-metrics-certs podName:43a57b66-554a-40f3-ae9c-1f8dd4053405 nodeName:}" failed. No retries permitted until 2025-09-30 14:12:10.046596658 +0000 UTC m=+834.029685077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-metrics-certs") pod "controller-5d688f5ffc-c24th" (UID: "43a57b66-554a-40f3-ae9c-1f8dd4053405") : secret "controller-certs-secret" not found Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.548871 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.550026 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-cert\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.563815 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwsnc\" (UniqueName: \"kubernetes.io/projected/32604773-f635-41b2-a665-740ace937075-kube-api-access-rwsnc\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.568941 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmpzl\" (UniqueName: \"kubernetes.io/projected/43a57b66-554a-40f3-ae9c-1f8dd4053405-kube-api-access-fmpzl\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.852462 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics-certs\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:09 crc kubenswrapper[4676]: I0930 14:12:09.858833 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5-metrics-certs\") pod \"frr-k8s-x4mlw\" (UID: \"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5\") " pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.007739 4676 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd"] Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.059652 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.059751 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-metrics-certs\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.059784 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-metrics-certs\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:10 crc kubenswrapper[4676]: E0930 14:12:10.060818 4676 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 14:12:10 crc kubenswrapper[4676]: E0930 14:12:10.060954 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist podName:32604773-f635-41b2-a665-740ace937075 nodeName:}" failed. No retries permitted until 2025-09-30 14:12:11.060927204 +0000 UTC m=+835.044015643 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist") pod "speaker-4gcd5" (UID: "32604773-f635-41b2-a665-740ace937075") : secret "metallb-memberlist" not found Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.064172 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-metrics-certs\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.064488 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43a57b66-554a-40f3-ae9c-1f8dd4053405-metrics-certs\") pod \"controller-5d688f5ffc-c24th\" (UID: \"43a57b66-554a-40f3-ae9c-1f8dd4053405\") " pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.129632 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.336796 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.585243 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" event={"ID":"518077e0-6a46-480c-9cdd-d5d5c64814b7","Type":"ContainerStarted","Data":"ff53394a4901ce3054bf855e1dee97efd89fe3f3c437d33ac158dd747c52e9d8"} Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.588106 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerStarted","Data":"33917e8ce8eac846aa43fd60dd2754fbfe309affb32d3bc4d0d64385fd136c84"} Sep 30 14:12:10 crc kubenswrapper[4676]: I0930 14:12:10.758690 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-c24th"] Sep 30 14:12:10 crc kubenswrapper[4676]: W0930 14:12:10.763464 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a57b66_554a_40f3_ae9c_1f8dd4053405.slice/crio-c5e999e48b3f72f2c55738a8499ddbc1aafbd27608df259fc8b73e7cc14e522a WatchSource:0}: Error finding container c5e999e48b3f72f2c55738a8499ddbc1aafbd27608df259fc8b73e7cc14e522a: Status 404 returned error can't find the container with id c5e999e48b3f72f2c55738a8499ddbc1aafbd27608df259fc8b73e7cc14e522a Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.074810 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.084499 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/32604773-f635-41b2-a665-740ace937075-memberlist\") pod \"speaker-4gcd5\" (UID: \"32604773-f635-41b2-a665-740ace937075\") " pod="metallb-system/speaker-4gcd5" Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.229552 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-4gcd5" Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.598005 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4gcd5" event={"ID":"32604773-f635-41b2-a665-740ace937075","Type":"ContainerStarted","Data":"f1a0dec0940267eaea627327a5d005dc592222d57fbc235c26cafabecfd0557a"} Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.598068 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4gcd5" event={"ID":"32604773-f635-41b2-a665-740ace937075","Type":"ContainerStarted","Data":"bbe6160f5f71ed1de5b37fd994dac0ca05cd243241d6b2164bf37bde40c085f8"} Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.602234 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-c24th" event={"ID":"43a57b66-554a-40f3-ae9c-1f8dd4053405","Type":"ContainerStarted","Data":"5d4f6e2cb66054542192c2771edec63521405b884c9ad1e5e007a63bef37da2f"} Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.602290 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-c24th" event={"ID":"43a57b66-554a-40f3-ae9c-1f8dd4053405","Type":"ContainerStarted","Data":"e3f9e9ce69f8267f079cdbb32ece28cc48b5a3b2fe40587d42a202286076d1ff"} Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.602305 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-c24th" event={"ID":"43a57b66-554a-40f3-ae9c-1f8dd4053405","Type":"ContainerStarted","Data":"c5e999e48b3f72f2c55738a8499ddbc1aafbd27608df259fc8b73e7cc14e522a"} Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.602422 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:11 crc kubenswrapper[4676]: I0930 14:12:11.629715 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-c24th" podStartSLOduration=2.6296899270000003 podStartE2EDuration="2.629689927s" podCreationTimestamp="2025-09-30 14:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:12:11.626647774 +0000 UTC m=+835.609736203" watchObservedRunningTime="2025-09-30 14:12:11.629689927 +0000 UTC m=+835.612778346" Sep 30 14:12:12 crc kubenswrapper[4676]: I0930 14:12:12.614992 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-4gcd5" event={"ID":"32604773-f635-41b2-a665-740ace937075","Type":"ContainerStarted","Data":"d61740a1d7cb3f444ef59fa4faa6f6d6f47626c54dbf827faab6bb0e8e1bf160"} Sep 30 14:12:12 crc kubenswrapper[4676]: I0930 14:12:12.642781 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-4gcd5" podStartSLOduration=3.642755232 podStartE2EDuration="3.642755232s" podCreationTimestamp="2025-09-30 14:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:12:12.637004448 +0000 UTC m=+836.620092897" watchObservedRunningTime="2025-09-30 14:12:12.642755232 +0000 UTC m=+836.625843661" Sep 30 14:12:13 crc kubenswrapper[4676]: I0930 14:12:13.628389 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-4gcd5" Sep 30 14:12:17 crc kubenswrapper[4676]: E0930 14:12:17.605776 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8ff8dd_f29a_4701_abdd_0cc1751a0ca5.slice/crio-conmon-974a8e792aff76d716e6bafd928d91c98b84f8a7d6b487ec8cca7c7b446eb02e.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:12:17 crc kubenswrapper[4676]: I0930 14:12:17.656451 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" event={"ID":"518077e0-6a46-480c-9cdd-d5d5c64814b7","Type":"ContainerStarted","Data":"9635b0915f5ec5867621e8307e3bfd3417a0061b81dc39d3954e5c433ccab490"} Sep 30 14:12:17 crc kubenswrapper[4676]: I0930 14:12:17.656907 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:17 crc kubenswrapper[4676]: I0930 14:12:17.659256 4676 generic.go:334] "Generic (PLEG): container finished" podID="5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5" containerID="974a8e792aff76d716e6bafd928d91c98b84f8a7d6b487ec8cca7c7b446eb02e" exitCode=0 Sep 30 14:12:17 crc kubenswrapper[4676]: I0930 14:12:17.659324 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerDied","Data":"974a8e792aff76d716e6bafd928d91c98b84f8a7d6b487ec8cca7c7b446eb02e"} Sep 30 14:12:17 crc kubenswrapper[4676]: I0930 14:12:17.706840 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" podStartSLOduration=1.396478119 podStartE2EDuration="8.706818059s" podCreationTimestamp="2025-09-30 14:12:09 +0000 UTC" firstStartedPulling="2025-09-30 14:12:10.016355079 +0000 UTC m=+833.999443508" lastFinishedPulling="2025-09-30 14:12:17.326695019 +0000 UTC m=+841.309783448" observedRunningTime="2025-09-30 14:12:17.675397627 +0000 UTC m=+841.658486076" watchObservedRunningTime="2025-09-30 14:12:17.706818059 +0000 UTC m=+841.689906488" Sep 30 14:12:18 crc 
kubenswrapper[4676]: I0930 14:12:18.668298 4676 generic.go:334] "Generic (PLEG): container finished" podID="5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5" containerID="182051e349c6f3d53a20c4319350fdf5c11affbfbf3720aabc54064f2359e6eb" exitCode=0 Sep 30 14:12:18 crc kubenswrapper[4676]: I0930 14:12:18.668369 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerDied","Data":"182051e349c6f3d53a20c4319350fdf5c11affbfbf3720aabc54064f2359e6eb"} Sep 30 14:12:19 crc kubenswrapper[4676]: I0930 14:12:19.676870 4676 generic.go:334] "Generic (PLEG): container finished" podID="5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5" containerID="2fcde1b2e8a7be70d781e650198123eddf09cb71bf01798a35a3b4fc9b593963" exitCode=0 Sep 30 14:12:19 crc kubenswrapper[4676]: I0930 14:12:19.677010 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerDied","Data":"2fcde1b2e8a7be70d781e650198123eddf09cb71bf01798a35a3b4fc9b593963"} Sep 30 14:12:20 crc kubenswrapper[4676]: I0930 14:12:20.341543 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-c24th" Sep 30 14:12:20 crc kubenswrapper[4676]: I0930 14:12:20.688063 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerStarted","Data":"595663a450411accd0f27b0586d40defc7a67d873029aaaa66549c279e30cebc"} Sep 30 14:12:20 crc kubenswrapper[4676]: I0930 14:12:20.688492 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerStarted","Data":"b52237ea1dbbc35c7ab76878954d5c92fae4ef69b4809548bf356ca19b8b8f8c"} Sep 30 14:12:20 crc kubenswrapper[4676]: I0930 14:12:20.688508 4676 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerStarted","Data":"dc84993ea83c1d1f6d01246fb064944d1edd400a388acbba9ffb437f3d88a309"} Sep 30 14:12:20 crc kubenswrapper[4676]: I0930 14:12:20.688518 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerStarted","Data":"6a3d903ddb2fbd218b192d05fe422d711b58e9dbad1bac12157d588607861d8a"} Sep 30 14:12:20 crc kubenswrapper[4676]: I0930 14:12:20.688528 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerStarted","Data":"9fbf63fe2007422ff7f2bb002252cae8765be84320dc2a73785dff4e3f2bbd67"} Sep 30 14:12:21 crc kubenswrapper[4676]: I0930 14:12:21.236508 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-4gcd5" Sep 30 14:12:21 crc kubenswrapper[4676]: I0930 14:12:21.697917 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x4mlw" event={"ID":"5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5","Type":"ContainerStarted","Data":"68313ea829a45a9c2216b50222d8fe649cd91628ad58a43ac9fc1419d83f9507"} Sep 30 14:12:21 crc kubenswrapper[4676]: I0930 14:12:21.698119 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:21 crc kubenswrapper[4676]: I0930 14:12:21.723041 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-x4mlw" podStartSLOduration=5.660321901 podStartE2EDuration="12.723007803s" podCreationTimestamp="2025-09-30 14:12:09 +0000 UTC" firstStartedPulling="2025-09-30 14:12:10.282287848 +0000 UTC m=+834.265376277" lastFinishedPulling="2025-09-30 14:12:17.34497375 +0000 UTC m=+841.328062179" observedRunningTime="2025-09-30 14:12:21.720516535 +0000 UTC m=+845.703604974" 
watchObservedRunningTime="2025-09-30 14:12:21.723007803 +0000 UTC m=+845.706096232" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.298010 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dsqzq"] Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.299240 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dsqzq" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.303446 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hq79c" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.304187 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.312109 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.328694 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dsqzq"] Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.411334 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnp2\" (UniqueName: \"kubernetes.io/projected/90da8db7-34a9-49da-9b7a-af0952da2e68-kube-api-access-8cnp2\") pod \"openstack-operator-index-dsqzq\" (UID: \"90da8db7-34a9-49da-9b7a-af0952da2e68\") " pod="openstack-operators/openstack-operator-index-dsqzq" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.513270 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnp2\" (UniqueName: \"kubernetes.io/projected/90da8db7-34a9-49da-9b7a-af0952da2e68-kube-api-access-8cnp2\") pod \"openstack-operator-index-dsqzq\" (UID: \"90da8db7-34a9-49da-9b7a-af0952da2e68\") " 
pod="openstack-operators/openstack-operator-index-dsqzq" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.542064 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnp2\" (UniqueName: \"kubernetes.io/projected/90da8db7-34a9-49da-9b7a-af0952da2e68-kube-api-access-8cnp2\") pod \"openstack-operator-index-dsqzq\" (UID: \"90da8db7-34a9-49da-9b7a-af0952da2e68\") " pod="openstack-operators/openstack-operator-index-dsqzq" Sep 30 14:12:24 crc kubenswrapper[4676]: I0930 14:12:24.621500 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dsqzq" Sep 30 14:12:25 crc kubenswrapper[4676]: I0930 14:12:25.040544 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dsqzq"] Sep 30 14:12:25 crc kubenswrapper[4676]: W0930 14:12:25.048163 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90da8db7_34a9_49da_9b7a_af0952da2e68.slice/crio-f4e44a1e172276840f1d8098245ea953c4f0aa9461acad90ca218c8aade5661e WatchSource:0}: Error finding container f4e44a1e172276840f1d8098245ea953c4f0aa9461acad90ca218c8aade5661e: Status 404 returned error can't find the container with id f4e44a1e172276840f1d8098245ea953c4f0aa9461acad90ca218c8aade5661e Sep 30 14:12:25 crc kubenswrapper[4676]: I0930 14:12:25.130916 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:25 crc kubenswrapper[4676]: I0930 14:12:25.168577 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:25 crc kubenswrapper[4676]: I0930 14:12:25.726320 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsqzq" 
event={"ID":"90da8db7-34a9-49da-9b7a-af0952da2e68","Type":"ContainerStarted","Data":"f4e44a1e172276840f1d8098245ea953c4f0aa9461acad90ca218c8aade5661e"} Sep 30 14:12:28 crc kubenswrapper[4676]: I0930 14:12:28.077804 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dsqzq"] Sep 30 14:12:28 crc kubenswrapper[4676]: I0930 14:12:28.888921 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fhxvs"] Sep 30 14:12:28 crc kubenswrapper[4676]: I0930 14:12:28.893845 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:28 crc kubenswrapper[4676]: I0930 14:12:28.901735 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fhxvs"] Sep 30 14:12:28 crc kubenswrapper[4676]: I0930 14:12:28.987799 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzl2d\" (UniqueName: \"kubernetes.io/projected/2bf0efcb-3cb6-491b-961f-6655a84de268-kube-api-access-tzl2d\") pod \"openstack-operator-index-fhxvs\" (UID: \"2bf0efcb-3cb6-491b-961f-6655a84de268\") " pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.089785 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzl2d\" (UniqueName: \"kubernetes.io/projected/2bf0efcb-3cb6-491b-961f-6655a84de268-kube-api-access-tzl2d\") pod \"openstack-operator-index-fhxvs\" (UID: \"2bf0efcb-3cb6-491b-961f-6655a84de268\") " pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.110726 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzl2d\" (UniqueName: \"kubernetes.io/projected/2bf0efcb-3cb6-491b-961f-6655a84de268-kube-api-access-tzl2d\") pod 
\"openstack-operator-index-fhxvs\" (UID: \"2bf0efcb-3cb6-491b-961f-6655a84de268\") " pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.221906 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.557487 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-rdndd" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.920229 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.920309 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.920368 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.921182 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8220f97a191b5b5caecda4ca09b3bb938693c25c347212483dfba93ae90b896c"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:12:29 crc kubenswrapper[4676]: I0930 14:12:29.921259 4676 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://8220f97a191b5b5caecda4ca09b3bb938693c25c347212483dfba93ae90b896c" gracePeriod=600 Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.134730 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-x4mlw" Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.483702 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fhxvs"] Sep 30 14:12:30 crc kubenswrapper[4676]: W0930 14:12:30.488073 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf0efcb_3cb6_491b_961f_6655a84de268.slice/crio-cd1b36ca0cbe8ac414cf479c05ecafbae1bd1b63811f52d87bac5fd3b893a80e WatchSource:0}: Error finding container cd1b36ca0cbe8ac414cf479c05ecafbae1bd1b63811f52d87bac5fd3b893a80e: Status 404 returned error can't find the container with id cd1b36ca0cbe8ac414cf479c05ecafbae1bd1b63811f52d87bac5fd3b893a80e Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.759624 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsqzq" event={"ID":"90da8db7-34a9-49da-9b7a-af0952da2e68","Type":"ContainerStarted","Data":"fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70"} Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.759913 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dsqzq" podUID="90da8db7-34a9-49da-9b7a-af0952da2e68" containerName="registry-server" containerID="cri-o://fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70" gracePeriod=2 Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.765402 4676 generic.go:334] 
"Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="8220f97a191b5b5caecda4ca09b3bb938693c25c347212483dfba93ae90b896c" exitCode=0 Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.765637 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"8220f97a191b5b5caecda4ca09b3bb938693c25c347212483dfba93ae90b896c"} Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.765732 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"124b2a96400d24919cd22f949ea67e3aa5eaa3e8b7e92aeb3ff38f0b94aac9fe"} Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.765805 4676 scope.go:117] "RemoveContainer" containerID="3afeb7a072f24b3d5a2bdb9ea432b2f6f591ebb112783320e64d4cf45e06e1b0" Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.767921 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fhxvs" event={"ID":"2bf0efcb-3cb6-491b-961f-6655a84de268","Type":"ContainerStarted","Data":"6996b5820fd337cac51f711460e93186496b0328be47fb841ace86d29dada3ac"} Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.767984 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fhxvs" event={"ID":"2bf0efcb-3cb6-491b-961f-6655a84de268","Type":"ContainerStarted","Data":"cd1b36ca0cbe8ac414cf479c05ecafbae1bd1b63811f52d87bac5fd3b893a80e"} Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.778072 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dsqzq" podStartSLOduration=1.5978263849999998 podStartE2EDuration="6.778059794s" podCreationTimestamp="2025-09-30 14:12:24 +0000 UTC" 
firstStartedPulling="2025-09-30 14:12:25.049831297 +0000 UTC m=+849.032919726" lastFinishedPulling="2025-09-30 14:12:30.230064706 +0000 UTC m=+854.213153135" observedRunningTime="2025-09-30 14:12:30.774265619 +0000 UTC m=+854.757354058" watchObservedRunningTime="2025-09-30 14:12:30.778059794 +0000 UTC m=+854.761148213" Sep 30 14:12:30 crc kubenswrapper[4676]: I0930 14:12:30.827963 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fhxvs" podStartSLOduration=2.770541096 podStartE2EDuration="2.827920918s" podCreationTimestamp="2025-09-30 14:12:28 +0000 UTC" firstStartedPulling="2025-09-30 14:12:30.491945616 +0000 UTC m=+854.475034045" lastFinishedPulling="2025-09-30 14:12:30.549325428 +0000 UTC m=+854.532413867" observedRunningTime="2025-09-30 14:12:30.811797314 +0000 UTC m=+854.794885743" watchObservedRunningTime="2025-09-30 14:12:30.827920918 +0000 UTC m=+854.811009347" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.105322 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dsqzq" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.222109 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnp2\" (UniqueName: \"kubernetes.io/projected/90da8db7-34a9-49da-9b7a-af0952da2e68-kube-api-access-8cnp2\") pod \"90da8db7-34a9-49da-9b7a-af0952da2e68\" (UID: \"90da8db7-34a9-49da-9b7a-af0952da2e68\") " Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.230407 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90da8db7-34a9-49da-9b7a-af0952da2e68-kube-api-access-8cnp2" (OuterVolumeSpecName: "kube-api-access-8cnp2") pod "90da8db7-34a9-49da-9b7a-af0952da2e68" (UID: "90da8db7-34a9-49da-9b7a-af0952da2e68"). InnerVolumeSpecName "kube-api-access-8cnp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.323766 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cnp2\" (UniqueName: \"kubernetes.io/projected/90da8db7-34a9-49da-9b7a-af0952da2e68-kube-api-access-8cnp2\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.775128 4676 generic.go:334] "Generic (PLEG): container finished" podID="90da8db7-34a9-49da-9b7a-af0952da2e68" containerID="fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70" exitCode=0 Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.775199 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsqzq" event={"ID":"90da8db7-34a9-49da-9b7a-af0952da2e68","Type":"ContainerDied","Data":"fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70"} Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.775231 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsqzq" event={"ID":"90da8db7-34a9-49da-9b7a-af0952da2e68","Type":"ContainerDied","Data":"f4e44a1e172276840f1d8098245ea953c4f0aa9461acad90ca218c8aade5661e"} Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.775268 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dsqzq" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.775284 4676 scope.go:117] "RemoveContainer" containerID="fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.795188 4676 scope.go:117] "RemoveContainer" containerID="fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70" Sep 30 14:12:31 crc kubenswrapper[4676]: E0930 14:12:31.795704 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70\": container with ID starting with fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70 not found: ID does not exist" containerID="fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.795766 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70"} err="failed to get container status \"fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70\": rpc error: code = NotFound desc = could not find container \"fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70\": container with ID starting with fcfeb0ca29c5e5332d080a05d6b41eff16655dffea79700b9893743f92cded70 not found: ID does not exist" Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.801066 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dsqzq"] Sep 30 14:12:31 crc kubenswrapper[4676]: I0930 14:12:31.803020 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dsqzq"] Sep 30 14:12:33 crc kubenswrapper[4676]: I0930 14:12:33.439421 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="90da8db7-34a9-49da-9b7a-af0952da2e68" path="/var/lib/kubelet/pods/90da8db7-34a9-49da-9b7a-af0952da2e68/volumes" Sep 30 14:12:39 crc kubenswrapper[4676]: I0930 14:12:39.222425 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:39 crc kubenswrapper[4676]: I0930 14:12:39.223939 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:39 crc kubenswrapper[4676]: I0930 14:12:39.274767 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:39 crc kubenswrapper[4676]: I0930 14:12:39.866494 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fhxvs" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.489215 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-skbzd"] Sep 30 14:12:41 crc kubenswrapper[4676]: E0930 14:12:41.489997 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90da8db7-34a9-49da-9b7a-af0952da2e68" containerName="registry-server" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.490017 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="90da8db7-34a9-49da-9b7a-af0952da2e68" containerName="registry-server" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.490206 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="90da8db7-34a9-49da-9b7a-af0952da2e68" containerName="registry-server" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.491587 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.506248 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbzd"] Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.581384 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8t42\" (UniqueName: \"kubernetes.io/projected/823d6d8e-2968-4663-96e6-2499f7a86b12-kube-api-access-f8t42\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.581592 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-catalog-content\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.581729 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-utilities\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.682787 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8t42\" (UniqueName: \"kubernetes.io/projected/823d6d8e-2968-4663-96e6-2499f7a86b12-kube-api-access-f8t42\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.682866 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-catalog-content\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.682922 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-utilities\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.683418 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-catalog-content\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.683431 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-utilities\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.703333 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8t42\" (UniqueName: \"kubernetes.io/projected/823d6d8e-2968-4663-96e6-2499f7a86b12-kube-api-access-f8t42\") pod \"redhat-marketplace-skbzd\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:41 crc kubenswrapper[4676]: I0930 14:12:41.813177 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:42 crc kubenswrapper[4676]: I0930 14:12:42.269237 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbzd"] Sep 30 14:12:42 crc kubenswrapper[4676]: I0930 14:12:42.859473 4676 generic.go:334] "Generic (PLEG): container finished" podID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerID="2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8" exitCode=0 Sep 30 14:12:42 crc kubenswrapper[4676]: I0930 14:12:42.859538 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbzd" event={"ID":"823d6d8e-2968-4663-96e6-2499f7a86b12","Type":"ContainerDied","Data":"2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8"} Sep 30 14:12:42 crc kubenswrapper[4676]: I0930 14:12:42.859949 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbzd" event={"ID":"823d6d8e-2968-4663-96e6-2499f7a86b12","Type":"ContainerStarted","Data":"41ecd4d345764a3dd649e9c7489c06fdb62c0aab10ac5c4ed8a45f26300691e2"} Sep 30 14:12:43 crc kubenswrapper[4676]: I0930 14:12:43.868907 4676 generic.go:334] "Generic (PLEG): container finished" podID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerID="5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706" exitCode=0 Sep 30 14:12:43 crc kubenswrapper[4676]: I0930 14:12:43.869112 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbzd" event={"ID":"823d6d8e-2968-4663-96e6-2499f7a86b12","Type":"ContainerDied","Data":"5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706"} Sep 30 14:12:44 crc kubenswrapper[4676]: I0930 14:12:44.885794 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbzd" 
event={"ID":"823d6d8e-2968-4663-96e6-2499f7a86b12","Type":"ContainerStarted","Data":"40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23"} Sep 30 14:12:45 crc kubenswrapper[4676]: I0930 14:12:45.932599 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-skbzd" podStartSLOduration=3.5054680400000002 podStartE2EDuration="4.932567922s" podCreationTimestamp="2025-09-30 14:12:41 +0000 UTC" firstStartedPulling="2025-09-30 14:12:42.862691572 +0000 UTC m=+866.845779991" lastFinishedPulling="2025-09-30 14:12:44.289791444 +0000 UTC m=+868.272879873" observedRunningTime="2025-09-30 14:12:44.904729066 +0000 UTC m=+868.887817485" watchObservedRunningTime="2025-09-30 14:12:45.932567922 +0000 UTC m=+869.915656351" Sep 30 14:12:45 crc kubenswrapper[4676]: I0930 14:12:45.938902 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm"] Sep 30 14:12:45 crc kubenswrapper[4676]: I0930 14:12:45.944078 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:45 crc kubenswrapper[4676]: I0930 14:12:45.953026 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-th7nm" Sep 30 14:12:45 crc kubenswrapper[4676]: I0930 14:12:45.955468 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm"] Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.049093 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh6vw\" (UniqueName: \"kubernetes.io/projected/0c22d3ee-9eda-44ac-b7af-367b71fc5505-kube-api-access-wh6vw\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.049157 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-bundle\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.049200 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-util\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 
14:12:46.150637 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh6vw\" (UniqueName: \"kubernetes.io/projected/0c22d3ee-9eda-44ac-b7af-367b71fc5505-kube-api-access-wh6vw\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.150700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-bundle\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.150754 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-util\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.151331 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-bundle\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.151551 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-util\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.173492 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh6vw\" (UniqueName: \"kubernetes.io/projected/0c22d3ee-9eda-44ac-b7af-367b71fc5505-kube-api-access-wh6vw\") pod \"64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.272489 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.718213 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm"] Sep 30 14:12:46 crc kubenswrapper[4676]: W0930 14:12:46.725916 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c22d3ee_9eda_44ac_b7af_367b71fc5505.slice/crio-93a0bbc3153e431679a287c7f7449d2826203834d51521aa89d4e250ad07924c WatchSource:0}: Error finding container 93a0bbc3153e431679a287c7f7449d2826203834d51521aa89d4e250ad07924c: Status 404 returned error can't find the container with id 93a0bbc3153e431679a287c7f7449d2826203834d51521aa89d4e250ad07924c Sep 30 14:12:46 crc kubenswrapper[4676]: I0930 14:12:46.902097 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" 
event={"ID":"0c22d3ee-9eda-44ac-b7af-367b71fc5505","Type":"ContainerStarted","Data":"93a0bbc3153e431679a287c7f7449d2826203834d51521aa89d4e250ad07924c"} Sep 30 14:12:47 crc kubenswrapper[4676]: I0930 14:12:47.910643 4676 generic.go:334] "Generic (PLEG): container finished" podID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerID="fd8a996920c3fa9c3369ce791c17a99e1229dc225ce83a50615f85977efc6fd1" exitCode=0 Sep 30 14:12:47 crc kubenswrapper[4676]: I0930 14:12:47.910719 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" event={"ID":"0c22d3ee-9eda-44ac-b7af-367b71fc5505","Type":"ContainerDied","Data":"fd8a996920c3fa9c3369ce791c17a99e1229dc225ce83a50615f85977efc6fd1"} Sep 30 14:12:48 crc kubenswrapper[4676]: I0930 14:12:48.920728 4676 generic.go:334] "Generic (PLEG): container finished" podID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerID="b7c90214b2c1657299ec6c7dc37e0d31ac0b969b93fec2177d2ac4eb649ef1fb" exitCode=0 Sep 30 14:12:48 crc kubenswrapper[4676]: I0930 14:12:48.920789 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" event={"ID":"0c22d3ee-9eda-44ac-b7af-367b71fc5505","Type":"ContainerDied","Data":"b7c90214b2c1657299ec6c7dc37e0d31ac0b969b93fec2177d2ac4eb649ef1fb"} Sep 30 14:12:49 crc kubenswrapper[4676]: I0930 14:12:49.936785 4676 generic.go:334] "Generic (PLEG): container finished" podID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerID="0a318442cd6807abc2c52b8e28fca9c66739ba2e1eecd19114d0940529d5aa6c" exitCode=0 Sep 30 14:12:49 crc kubenswrapper[4676]: I0930 14:12:49.936902 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" event={"ID":"0c22d3ee-9eda-44ac-b7af-367b71fc5505","Type":"ContainerDied","Data":"0a318442cd6807abc2c52b8e28fca9c66739ba2e1eecd19114d0940529d5aa6c"} Sep 30 
14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.258692 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.338729 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-bundle\") pod \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.338917 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-util\") pod \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.338949 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh6vw\" (UniqueName: \"kubernetes.io/projected/0c22d3ee-9eda-44ac-b7af-367b71fc5505-kube-api-access-wh6vw\") pod \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\" (UID: \"0c22d3ee-9eda-44ac-b7af-367b71fc5505\") " Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.340783 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-bundle" (OuterVolumeSpecName: "bundle") pod "0c22d3ee-9eda-44ac-b7af-367b71fc5505" (UID: "0c22d3ee-9eda-44ac-b7af-367b71fc5505"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.347468 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c22d3ee-9eda-44ac-b7af-367b71fc5505-kube-api-access-wh6vw" (OuterVolumeSpecName: "kube-api-access-wh6vw") pod "0c22d3ee-9eda-44ac-b7af-367b71fc5505" (UID: "0c22d3ee-9eda-44ac-b7af-367b71fc5505"). InnerVolumeSpecName "kube-api-access-wh6vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.355739 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-util" (OuterVolumeSpecName: "util") pod "0c22d3ee-9eda-44ac-b7af-367b71fc5505" (UID: "0c22d3ee-9eda-44ac-b7af-367b71fc5505"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.441009 4676 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-util\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.441046 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh6vw\" (UniqueName: \"kubernetes.io/projected/0c22d3ee-9eda-44ac-b7af-367b71fc5505-kube-api-access-wh6vw\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.441059 4676 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c22d3ee-9eda-44ac-b7af-367b71fc5505-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.813662 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.813777 4676 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.863221 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.956922 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" event={"ID":"0c22d3ee-9eda-44ac-b7af-367b71fc5505","Type":"ContainerDied","Data":"93a0bbc3153e431679a287c7f7449d2826203834d51521aa89d4e250ad07924c"} Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.956958 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm" Sep 30 14:12:51 crc kubenswrapper[4676]: I0930 14:12:51.956987 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93a0bbc3153e431679a287c7f7449d2826203834d51521aa89d4e250ad07924c" Sep 30 14:12:52 crc kubenswrapper[4676]: I0930 14:12:52.004536 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.278231 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbzd"] Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.279068 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-skbzd" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="registry-server" containerID="cri-o://40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23" gracePeriod=2 Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.712526 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.791233 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8t42\" (UniqueName: \"kubernetes.io/projected/823d6d8e-2968-4663-96e6-2499f7a86b12-kube-api-access-f8t42\") pod \"823d6d8e-2968-4663-96e6-2499f7a86b12\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.791383 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-catalog-content\") pod \"823d6d8e-2968-4663-96e6-2499f7a86b12\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.791448 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-utilities\") pod \"823d6d8e-2968-4663-96e6-2499f7a86b12\" (UID: \"823d6d8e-2968-4663-96e6-2499f7a86b12\") " Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.792723 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-utilities" (OuterVolumeSpecName: "utilities") pod "823d6d8e-2968-4663-96e6-2499f7a86b12" (UID: "823d6d8e-2968-4663-96e6-2499f7a86b12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.798678 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823d6d8e-2968-4663-96e6-2499f7a86b12-kube-api-access-f8t42" (OuterVolumeSpecName: "kube-api-access-f8t42") pod "823d6d8e-2968-4663-96e6-2499f7a86b12" (UID: "823d6d8e-2968-4663-96e6-2499f7a86b12"). InnerVolumeSpecName "kube-api-access-f8t42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.804488 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "823d6d8e-2968-4663-96e6-2499f7a86b12" (UID: "823d6d8e-2968-4663-96e6-2499f7a86b12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.893433 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.893464 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823d6d8e-2968-4663-96e6-2499f7a86b12-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.893475 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8t42\" (UniqueName: \"kubernetes.io/projected/823d6d8e-2968-4663-96e6-2499f7a86b12-kube-api-access-f8t42\") on node \"crc\" DevicePath \"\"" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.980370 4676 generic.go:334] "Generic (PLEG): container finished" podID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerID="40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23" exitCode=0 Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.980425 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skbzd" event={"ID":"823d6d8e-2968-4663-96e6-2499f7a86b12","Type":"ContainerDied","Data":"40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23"} Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.980468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-skbzd" event={"ID":"823d6d8e-2968-4663-96e6-2499f7a86b12","Type":"ContainerDied","Data":"41ecd4d345764a3dd649e9c7489c06fdb62c0aab10ac5c4ed8a45f26300691e2"} Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.980493 4676 scope.go:117] "RemoveContainer" containerID="40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23" Sep 30 14:12:54 crc kubenswrapper[4676]: I0930 14:12:54.980491 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skbzd" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.002743 4676 scope.go:117] "RemoveContainer" containerID="5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.022164 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbzd"] Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.025403 4676 scope.go:117] "RemoveContainer" containerID="2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.027657 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-skbzd"] Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.049160 4676 scope.go:117] "RemoveContainer" containerID="40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23" Sep 30 14:12:55 crc kubenswrapper[4676]: E0930 14:12:55.050094 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23\": container with ID starting with 40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23 not found: ID does not exist" containerID="40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.050178 4676 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23"} err="failed to get container status \"40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23\": rpc error: code = NotFound desc = could not find container \"40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23\": container with ID starting with 40053c8435896d839b644385d2b1f3d9d47e373f7e410c0483347f329fb36b23 not found: ID does not exist" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.050240 4676 scope.go:117] "RemoveContainer" containerID="5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706" Sep 30 14:12:55 crc kubenswrapper[4676]: E0930 14:12:55.051067 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706\": container with ID starting with 5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706 not found: ID does not exist" containerID="5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.051105 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706"} err="failed to get container status \"5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706\": rpc error: code = NotFound desc = could not find container \"5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706\": container with ID starting with 5446e534a962855d0578baf36e67b91b665e931221e6ffd60001f3fe08fce706 not found: ID does not exist" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.051132 4676 scope.go:117] "RemoveContainer" containerID="2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8" Sep 30 14:12:55 crc kubenswrapper[4676]: E0930 
14:12:55.051410 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8\": container with ID starting with 2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8 not found: ID does not exist" containerID="2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.051432 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8"} err="failed to get container status \"2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8\": rpc error: code = NotFound desc = could not find container \"2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8\": container with ID starting with 2e6fb42acae3f093fd1cc813d96048ed8bf19970a68a0ea8a580ff56a53cf3b8 not found: ID does not exist" Sep 30 14:12:55 crc kubenswrapper[4676]: I0930 14:12:55.442619 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" path="/var/lib/kubelet/pods/823d6d8e-2968-4663-96e6-2499f7a86b12/volumes" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890174 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv"] Sep 30 14:12:56 crc kubenswrapper[4676]: E0930 14:12:56.890699 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerName="util" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890712 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerName="util" Sep 30 14:12:56 crc kubenswrapper[4676]: E0930 14:12:56.890730 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerName="pull" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890737 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerName="pull" Sep 30 14:12:56 crc kubenswrapper[4676]: E0930 14:12:56.890745 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerName="extract" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890752 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" containerName="extract" Sep 30 14:12:56 crc kubenswrapper[4676]: E0930 14:12:56.890760 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="registry-server" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890766 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="registry-server" Sep 30 14:12:56 crc kubenswrapper[4676]: E0930 14:12:56.890776 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="extract-utilities" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890782 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="extract-utilities" Sep 30 14:12:56 crc kubenswrapper[4676]: E0930 14:12:56.890793 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="extract-content" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890799 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="extract-content" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890914 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c22d3ee-9eda-44ac-b7af-367b71fc5505" 
containerName="extract" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.890928 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="823d6d8e-2968-4663-96e6-2499f7a86b12" containerName="registry-server" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.891731 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.894828 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-llpqx" Sep 30 14:12:56 crc kubenswrapper[4676]: I0930 14:12:56.920319 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv"] Sep 30 14:12:57 crc kubenswrapper[4676]: I0930 14:12:57.026955 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj2xh\" (UniqueName: \"kubernetes.io/projected/df3c9717-78cf-49b2-a967-7177da8f2e17-kube-api-access-qj2xh\") pod \"openstack-operator-controller-operator-55ccb8ddf4-slxtv\" (UID: \"df3c9717-78cf-49b2-a967-7177da8f2e17\") " pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" Sep 30 14:12:57 crc kubenswrapper[4676]: I0930 14:12:57.128219 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj2xh\" (UniqueName: \"kubernetes.io/projected/df3c9717-78cf-49b2-a967-7177da8f2e17-kube-api-access-qj2xh\") pod \"openstack-operator-controller-operator-55ccb8ddf4-slxtv\" (UID: \"df3c9717-78cf-49b2-a967-7177da8f2e17\") " pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" Sep 30 14:12:57 crc kubenswrapper[4676]: I0930 14:12:57.149218 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj2xh\" (UniqueName: 
\"kubernetes.io/projected/df3c9717-78cf-49b2-a967-7177da8f2e17-kube-api-access-qj2xh\") pod \"openstack-operator-controller-operator-55ccb8ddf4-slxtv\" (UID: \"df3c9717-78cf-49b2-a967-7177da8f2e17\") " pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" Sep 30 14:12:57 crc kubenswrapper[4676]: I0930 14:12:57.211570 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" Sep 30 14:12:57 crc kubenswrapper[4676]: I0930 14:12:57.654926 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv"] Sep 30 14:12:58 crc kubenswrapper[4676]: I0930 14:12:58.002927 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" event={"ID":"df3c9717-78cf-49b2-a967-7177da8f2e17","Type":"ContainerStarted","Data":"fe89c780fcf192b108fa482a7d83ed07d29ba2488c40b488d7a213b2bcceca66"} Sep 30 14:13:02 crc kubenswrapper[4676]: I0930 14:13:02.038806 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" event={"ID":"df3c9717-78cf-49b2-a967-7177da8f2e17","Type":"ContainerStarted","Data":"0b6304e13562e0c370283453d8a406e22e60d122fd01ae734a2020b655ed5e8f"} Sep 30 14:13:05 crc kubenswrapper[4676]: I0930 14:13:05.061451 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" event={"ID":"df3c9717-78cf-49b2-a967-7177da8f2e17","Type":"ContainerStarted","Data":"9bd879646a0c97497967dd75c2b4b4f8d2537f8572e79b3b31c03c26e0bb4560"} Sep 30 14:13:05 crc kubenswrapper[4676]: I0930 14:13:05.062017 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" Sep 30 14:13:05 crc 
kubenswrapper[4676]: I0930 14:13:05.106143 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" podStartSLOduration=2.217946044 podStartE2EDuration="9.106121168s" podCreationTimestamp="2025-09-30 14:12:56 +0000 UTC" firstStartedPulling="2025-09-30 14:12:57.664479626 +0000 UTC m=+881.647568055" lastFinishedPulling="2025-09-30 14:13:04.55265475 +0000 UTC m=+888.535743179" observedRunningTime="2025-09-30 14:13:05.101766598 +0000 UTC m=+889.084855027" watchObservedRunningTime="2025-09-30 14:13:05.106121168 +0000 UTC m=+889.089209597" Sep 30 14:13:06 crc kubenswrapper[4676]: I0930 14:13:06.071636 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55ccb8ddf4-slxtv" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.265614 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.267851 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.270988 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nshk9" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.280863 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.282046 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.287782 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fpfkl" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.295318 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.304672 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.306335 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.309125 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qd67v" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.337640 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.338404 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fx8d\" (UniqueName: \"kubernetes.io/projected/9e7d83e3-0f96-4a53-88ad-568d39435e5f-kube-api-access-5fx8d\") pod \"barbican-operator-controller-manager-6ff8b75857-7b844\" (UID: \"9e7d83e3-0f96-4a53-88ad-568d39435e5f\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.338516 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t6fld\" (UniqueName: \"kubernetes.io/projected/2342a742-ce41-4487-9d32-34fc69cb4445-kube-api-access-t6fld\") pod \"cinder-operator-controller-manager-644bddb6d8-7tf2x\" (UID: \"2342a742-ce41-4487-9d32-34fc69cb4445\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.349640 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.355524 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.356934 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.383379 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4v498" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.445454 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swzd7\" (UniqueName: \"kubernetes.io/projected/b5ead6b1-3f68-454e-847c-89cac8d7f1f0-kube-api-access-swzd7\") pod \"glance-operator-controller-manager-84958c4d49-zn6zg\" (UID: \"b5ead6b1-3f68-454e-847c-89cac8d7f1f0\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.445545 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbl6\" (UniqueName: \"kubernetes.io/projected/8f9d1069-29eb-42e5-8029-1ed616f31c4a-kube-api-access-pdbl6\") pod \"designate-operator-controller-manager-84f4f7b77b-7qc9p\" (UID: 
\"8f9d1069-29eb-42e5-8029-1ed616f31c4a\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.445586 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6fld\" (UniqueName: \"kubernetes.io/projected/2342a742-ce41-4487-9d32-34fc69cb4445-kube-api-access-t6fld\") pod \"cinder-operator-controller-manager-644bddb6d8-7tf2x\" (UID: \"2342a742-ce41-4487-9d32-34fc69cb4445\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.445678 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fx8d\" (UniqueName: \"kubernetes.io/projected/9e7d83e3-0f96-4a53-88ad-568d39435e5f-kube-api-access-5fx8d\") pod \"barbican-operator-controller-manager-6ff8b75857-7b844\" (UID: \"9e7d83e3-0f96-4a53-88ad-568d39435e5f\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.454508 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.455844 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.460414 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-k57w5" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.460788 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.461871 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.467281 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-6mxb6" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.472426 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.487017 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fx8d\" (UniqueName: \"kubernetes.io/projected/9e7d83e3-0f96-4a53-88ad-568d39435e5f-kube-api-access-5fx8d\") pod \"barbican-operator-controller-manager-6ff8b75857-7b844\" (UID: \"9e7d83e3-0f96-4a53-88ad-568d39435e5f\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.488911 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.498193 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.507751 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.509172 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.510952 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6fld\" (UniqueName: \"kubernetes.io/projected/2342a742-ce41-4487-9d32-34fc69cb4445-kube-api-access-t6fld\") pod \"cinder-operator-controller-manager-644bddb6d8-7tf2x\" (UID: \"2342a742-ce41-4487-9d32-34fc69cb4445\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.522246 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.523922 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.538068 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.539406 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.541179 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pmrcc" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.547483 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbl6\" (UniqueName: \"kubernetes.io/projected/8f9d1069-29eb-42e5-8029-1ed616f31c4a-kube-api-access-pdbl6\") pod \"designate-operator-controller-manager-84f4f7b77b-7qc9p\" (UID: \"8f9d1069-29eb-42e5-8029-1ed616f31c4a\") " 
pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.547535 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4rln\" (UniqueName: \"kubernetes.io/projected/1d51c97e-7c47-4274-8bd4-bc3d7402a378-kube-api-access-g4rln\") pod \"horizon-operator-controller-manager-9f4696d94-s9znt\" (UID: \"1d51c97e-7c47-4274-8bd4-bc3d7402a378\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.547608 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j779g\" (UniqueName: \"kubernetes.io/projected/930c8b21-3bfd-497b-9bc7-60f2cf7abde6-kube-api-access-j779g\") pod \"heat-operator-controller-manager-5d889d78cf-6qcf5\" (UID: \"930c8b21-3bfd-497b-9bc7-60f2cf7abde6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.547641 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swzd7\" (UniqueName: \"kubernetes.io/projected/b5ead6b1-3f68-454e-847c-89cac8d7f1f0-kube-api-access-swzd7\") pod \"glance-operator-controller-manager-84958c4d49-zn6zg\" (UID: \"b5ead6b1-3f68-454e-847c-89cac8d7f1f0\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.560043 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.569034 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.570778 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.571028 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hs8nx" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.597014 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw"] Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.597286 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2csn9" Sep 30 14:13:43 crc kubenswrapper[4676]: I0930 14:13:43.597845 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.608461 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbl6\" (UniqueName: \"kubernetes.io/projected/8f9d1069-29eb-42e5-8029-1ed616f31c4a-kube-api-access-pdbl6\") pod \"designate-operator-controller-manager-84f4f7b77b-7qc9p\" (UID: \"8f9d1069-29eb-42e5-8029-1ed616f31c4a\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.608984 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swzd7\" (UniqueName: \"kubernetes.io/projected/b5ead6b1-3f68-454e-847c-89cac8d7f1f0-kube-api-access-swzd7\") pod \"glance-operator-controller-manager-84958c4d49-zn6zg\" (UID: \"b5ead6b1-3f68-454e-847c-89cac8d7f1f0\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.612949 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.619966 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.621366 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.628737 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.628792 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-d7slg" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.650810 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c15f6efe-27f0-4f55-b1f0-957366ff23a4-cert\") pod \"infra-operator-controller-manager-7d857cc749-mg54v\" (UID: \"c15f6efe-27f0-4f55-b1f0-957366ff23a4\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.650869 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j779g\" (UniqueName: \"kubernetes.io/projected/930c8b21-3bfd-497b-9bc7-60f2cf7abde6-kube-api-access-j779g\") pod \"heat-operator-controller-manager-5d889d78cf-6qcf5\" (UID: \"930c8b21-3bfd-497b-9bc7-60f2cf7abde6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.650926 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77g2g\" (UniqueName: \"kubernetes.io/projected/daee0b60-331c-4108-8881-66cf4eb731e0-kube-api-access-77g2g\") pod \"manila-operator-controller-manager-6d68dbc695-gh2vw\" (UID: \"daee0b60-331c-4108-8881-66cf4eb731e0\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.650971 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zng6k\" (UniqueName: \"kubernetes.io/projected/5e535753-178a-4b7b-b20c-e13fa0be5ce1-kube-api-access-zng6k\") pod 
\"ironic-operator-controller-manager-7975b88857-4dzg7\" (UID: \"5e535753-178a-4b7b-b20c-e13fa0be5ce1\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.651050 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4rln\" (UniqueName: \"kubernetes.io/projected/1d51c97e-7c47-4274-8bd4-bc3d7402a378-kube-api-access-g4rln\") pod \"horizon-operator-controller-manager-9f4696d94-s9znt\" (UID: \"1d51c97e-7c47-4274-8bd4-bc3d7402a378\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.651106 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4v6x\" (UniqueName: \"kubernetes.io/projected/c15f6efe-27f0-4f55-b1f0-957366ff23a4-kube-api-access-z4v6x\") pod \"infra-operator-controller-manager-7d857cc749-mg54v\" (UID: \"c15f6efe-27f0-4f55-b1f0-957366ff23a4\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.654972 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-n5djz"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.657208 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.680463 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wh2ns" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.686870 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j779g\" (UniqueName: \"kubernetes.io/projected/930c8b21-3bfd-497b-9bc7-60f2cf7abde6-kube-api-access-j779g\") pod \"heat-operator-controller-manager-5d889d78cf-6qcf5\" (UID: \"930c8b21-3bfd-497b-9bc7-60f2cf7abde6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.695337 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4rln\" (UniqueName: \"kubernetes.io/projected/1d51c97e-7c47-4274-8bd4-bc3d7402a378-kube-api-access-g4rln\") pod \"horizon-operator-controller-manager-9f4696d94-s9znt\" (UID: \"1d51c97e-7c47-4274-8bd4-bc3d7402a378\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.706667 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.719963 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.720632 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.728439 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jgsg5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.729412 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.739777 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-n5djz"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.739826 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.741220 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.752361 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-8kd4f" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.757176 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.758139 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zng6k\" (UniqueName: \"kubernetes.io/projected/5e535753-178a-4b7b-b20c-e13fa0be5ce1-kube-api-access-zng6k\") pod \"ironic-operator-controller-manager-7975b88857-4dzg7\" (UID: \"5e535753-178a-4b7b-b20c-e13fa0be5ce1\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.758193 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmps6\" (UniqueName: \"kubernetes.io/projected/5dbc4210-e31a-4bf8-a5cb-6f00a7406743-kube-api-access-kmps6\") pod \"nova-operator-controller-manager-c7c776c96-8fx9n\" (UID: \"5dbc4210-e31a-4bf8-a5cb-6f00a7406743\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.758231 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pssh\" (UniqueName: \"kubernetes.io/projected/dbe0db98-4cbd-49d2-9f6a-f54a8189c64b-kube-api-access-7pssh\") pod \"neutron-operator-controller-manager-64d7b59854-t6h2t\" (UID: \"dbe0db98-4cbd-49d2-9f6a-f54a8189c64b\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.758256 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4v6x\" (UniqueName: \"kubernetes.io/projected/c15f6efe-27f0-4f55-b1f0-957366ff23a4-kube-api-access-z4v6x\") pod \"infra-operator-controller-manager-7d857cc749-mg54v\" (UID: \"c15f6efe-27f0-4f55-b1f0-957366ff23a4\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.758287 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c15f6efe-27f0-4f55-b1f0-957366ff23a4-cert\") pod \"infra-operator-controller-manager-7d857cc749-mg54v\" (UID: \"c15f6efe-27f0-4f55-b1f0-957366ff23a4\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.758306 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqdzt\" (UniqueName: \"kubernetes.io/projected/d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85-kube-api-access-lqdzt\") pod \"mariadb-operator-controller-manager-88c7-n5djz\" (UID: \"d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.758338 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77g2g\" (UniqueName: \"kubernetes.io/projected/daee0b60-331c-4108-8881-66cf4eb731e0-kube-api-access-77g2g\") pod \"manila-operator-controller-manager-6d68dbc695-gh2vw\" (UID: \"daee0b60-331c-4108-8881-66cf4eb731e0\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" Sep 30 14:13:44 crc kubenswrapper[4676]: E0930 14:13:43.758969 4676 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 14:13:44 crc kubenswrapper[4676]: E0930 
14:13:43.759029 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c15f6efe-27f0-4f55-b1f0-957366ff23a4-cert podName:c15f6efe-27f0-4f55-b1f0-957366ff23a4 nodeName:}" failed. No retries permitted until 2025-09-30 14:13:44.258997809 +0000 UTC m=+928.242086238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c15f6efe-27f0-4f55-b1f0-957366ff23a4-cert") pod "infra-operator-controller-manager-7d857cc749-mg54v" (UID: "c15f6efe-27f0-4f55-b1f0-957366ff23a4") : secret "infra-operator-webhook-server-cert" not found Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.770029 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.781913 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.783201 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.792797 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.801547 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ffqqr" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.806514 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zng6k\" (UniqueName: \"kubernetes.io/projected/5e535753-178a-4b7b-b20c-e13fa0be5ce1-kube-api-access-zng6k\") pod \"ironic-operator-controller-manager-7975b88857-4dzg7\" (UID: \"5e535753-178a-4b7b-b20c-e13fa0be5ce1\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.817634 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.839702 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4v6x\" (UniqueName: \"kubernetes.io/projected/c15f6efe-27f0-4f55-b1f0-957366ff23a4-kube-api-access-z4v6x\") pod \"infra-operator-controller-manager-7d857cc749-mg54v\" (UID: \"c15f6efe-27f0-4f55-b1f0-957366ff23a4\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.840332 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77g2g\" (UniqueName: \"kubernetes.io/projected/daee0b60-331c-4108-8881-66cf4eb731e0-kube-api-access-77g2g\") pod \"manila-operator-controller-manager-6d68dbc695-gh2vw\" (UID: \"daee0b60-331c-4108-8881-66cf4eb731e0\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.852956 4676 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.854360 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.865001 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.866229 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wpzcn" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.866456 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.867964 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcwx\" (UniqueName: \"kubernetes.io/projected/7e6672d2-5e94-4d5d-b927-ad3573b95469-kube-api-access-wkcwx\") pod \"keystone-operator-controller-manager-5bd55b4bff-8pv7q\" (UID: \"7e6672d2-5e94-4d5d-b927-ad3573b95469\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.868038 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmps6\" (UniqueName: \"kubernetes.io/projected/5dbc4210-e31a-4bf8-a5cb-6f00a7406743-kube-api-access-kmps6\") pod \"nova-operator-controller-manager-c7c776c96-8fx9n\" (UID: \"5dbc4210-e31a-4bf8-a5cb-6f00a7406743\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.868085 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7pssh\" (UniqueName: \"kubernetes.io/projected/dbe0db98-4cbd-49d2-9f6a-f54a8189c64b-kube-api-access-7pssh\") pod \"neutron-operator-controller-manager-64d7b59854-t6h2t\" (UID: \"dbe0db98-4cbd-49d2-9f6a-f54a8189c64b\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.868139 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqdzt\" (UniqueName: \"kubernetes.io/projected/d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85-kube-api-access-lqdzt\") pod \"mariadb-operator-controller-manager-88c7-n5djz\" (UID: \"d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.868176 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrfl\" (UniqueName: \"kubernetes.io/projected/43f93725-c577-4253-ae9c-7d14e8aec0b9-kube-api-access-kgrfl\") pod \"octavia-operator-controller-manager-76fcc6dc7c-8m45v\" (UID: \"43f93725-c577-4253-ae9c-7d14e8aec0b9\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.879222 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.879518 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-8pf7d" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.906727 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqdzt\" (UniqueName: \"kubernetes.io/projected/d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85-kube-api-access-lqdzt\") pod 
\"mariadb-operator-controller-manager-88c7-n5djz\" (UID: \"d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.926639 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmps6\" (UniqueName: \"kubernetes.io/projected/5dbc4210-e31a-4bf8-a5cb-6f00a7406743-kube-api-access-kmps6\") pod \"nova-operator-controller-manager-c7c776c96-8fx9n\" (UID: \"5dbc4210-e31a-4bf8-a5cb-6f00a7406743\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.953086 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pssh\" (UniqueName: \"kubernetes.io/projected/dbe0db98-4cbd-49d2-9f6a-f54a8189c64b-kube-api-access-7pssh\") pod \"neutron-operator-controller-manager-64d7b59854-t6h2t\" (UID: \"dbe0db98-4cbd-49d2-9f6a-f54a8189c64b\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.971930 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stqs6\" (UniqueName: \"kubernetes.io/projected/e51ea15a-8d04-4d56-956d-0fcf41846eb8-kube-api-access-stqs6\") pod \"ovn-operator-controller-manager-9976ff44c-g2s9t\" (UID: \"e51ea15a-8d04-4d56-956d-0fcf41846eb8\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.971998 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.972035 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqmb4\" (UniqueName: \"kubernetes.io/projected/96c8a26f-c044-429d-90eb-d0342486c32f-kube-api-access-fqmb4\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.972095 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrfl\" (UniqueName: \"kubernetes.io/projected/43f93725-c577-4253-ae9c-7d14e8aec0b9-kube-api-access-kgrfl\") pod \"octavia-operator-controller-manager-76fcc6dc7c-8m45v\" (UID: \"43f93725-c577-4253-ae9c-7d14e8aec0b9\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:43.972148 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcwx\" (UniqueName: \"kubernetes.io/projected/7e6672d2-5e94-4d5d-b927-ad3573b95469-kube-api-access-wkcwx\") pod \"keystone-operator-controller-manager-5bd55b4bff-8pv7q\" (UID: \"7e6672d2-5e94-4d5d-b927-ad3573b95469\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.003087 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.013004 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.025784 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrfl\" (UniqueName: \"kubernetes.io/projected/43f93725-c577-4253-ae9c-7d14e8aec0b9-kube-api-access-kgrfl\") pod \"octavia-operator-controller-manager-76fcc6dc7c-8m45v\" (UID: \"43f93725-c577-4253-ae9c-7d14e8aec0b9\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.101964 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcwx\" (UniqueName: \"kubernetes.io/projected/7e6672d2-5e94-4d5d-b927-ad3573b95469-kube-api-access-wkcwx\") pod \"keystone-operator-controller-manager-5bd55b4bff-8pv7q\" (UID: \"7e6672d2-5e94-4d5d-b927-ad3573b95469\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.103964 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.123698 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stqs6\" (UniqueName: \"kubernetes.io/projected/e51ea15a-8d04-4d56-956d-0fcf41846eb8-kube-api-access-stqs6\") pod \"ovn-operator-controller-manager-9976ff44c-g2s9t\" (UID: \"e51ea15a-8d04-4d56-956d-0fcf41846eb8\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.123760 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.123801 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqmb4\" (UniqueName: \"kubernetes.io/projected/96c8a26f-c044-429d-90eb-d0342486c32f-kube-api-access-fqmb4\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:44 crc kubenswrapper[4676]: E0930 14:13:44.124568 4676 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 14:13:44 crc kubenswrapper[4676]: E0930 14:13:44.124624 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert podName:96c8a26f-c044-429d-90eb-d0342486c32f nodeName:}" failed. No retries permitted until 2025-09-30 14:13:44.624609392 +0000 UTC m=+928.607697821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-qd7b8" (UID: "96c8a26f-c044-429d-90eb-d0342486c32f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.156505 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqmb4\" (UniqueName: \"kubernetes.io/projected/96c8a26f-c044-429d-90eb-d0342486c32f-kube-api-access-fqmb4\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.179573 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.194723 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-t46jw" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.203369 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.218243 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.233036 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stqs6\" (UniqueName: \"kubernetes.io/projected/e51ea15a-8d04-4d56-956d-0fcf41846eb8-kube-api-access-stqs6\") pod \"ovn-operator-controller-manager-9976ff44c-g2s9t\" (UID: \"e51ea15a-8d04-4d56-956d-0fcf41846eb8\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.233695 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-whwc2" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.244708 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.246543 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.251612 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-pb46s" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.252903 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.253477 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.268993 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.274320 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.296670 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.343612 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bdm\" (UniqueName: \"kubernetes.io/projected/cedc986e-ac92-45e8-862a-fc4dcb60455d-kube-api-access-l9bdm\") pod \"swift-operator-controller-manager-bc7dc7bd9-l4p8f\" (UID: \"cedc986e-ac92-45e8-862a-fc4dcb60455d\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.343702 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c15f6efe-27f0-4f55-b1f0-957366ff23a4-cert\") pod \"infra-operator-controller-manager-7d857cc749-mg54v\" (UID: \"c15f6efe-27f0-4f55-b1f0-957366ff23a4\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.343768 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbztv\" (UniqueName: \"kubernetes.io/projected/a3762232-4e9f-452e-aea4-c5feb443ad75-kube-api-access-gbztv\") pod \"placement-operator-controller-manager-589c58c6c-9dzn5\" (UID: \"a3762232-4e9f-452e-aea4-c5feb443ad75\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.343813 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjqx\" (UniqueName: \"kubernetes.io/projected/aa6dd699-ccd5-476f-ab9c-3d4841ed591a-kube-api-access-pdjqx\") pod 
\"telemetry-operator-controller-manager-b8d54b5d7-wf6fb\" (UID: \"aa6dd699-ccd5-476f-ab9c-3d4841ed591a\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.348788 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.358292 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c15f6efe-27f0-4f55-b1f0-957366ff23a4-cert\") pod \"infra-operator-controller-manager-7d857cc749-mg54v\" (UID: \"c15f6efe-27f0-4f55-b1f0-957366ff23a4\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.365113 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.367178 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.371312 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.372010 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-db4jp" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.386872 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.388913 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.391945 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xf79r" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.392892 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.400940 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.435023 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.446266 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjqx\" (UniqueName: \"kubernetes.io/projected/aa6dd699-ccd5-476f-ab9c-3d4841ed591a-kube-api-access-pdjqx\") pod \"telemetry-operator-controller-manager-b8d54b5d7-wf6fb\" (UID: \"aa6dd699-ccd5-476f-ab9c-3d4841ed591a\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.446378 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bdm\" (UniqueName: \"kubernetes.io/projected/cedc986e-ac92-45e8-862a-fc4dcb60455d-kube-api-access-l9bdm\") pod \"swift-operator-controller-manager-bc7dc7bd9-l4p8f\" (UID: \"cedc986e-ac92-45e8-862a-fc4dcb60455d\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.446541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gbztv\" (UniqueName: \"kubernetes.io/projected/a3762232-4e9f-452e-aea4-c5feb443ad75-kube-api-access-gbztv\") pod \"placement-operator-controller-manager-589c58c6c-9dzn5\" (UID: \"a3762232-4e9f-452e-aea4-c5feb443ad75\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.470259 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjqx\" (UniqueName: \"kubernetes.io/projected/aa6dd699-ccd5-476f-ab9c-3d4841ed591a-kube-api-access-pdjqx\") pod \"telemetry-operator-controller-manager-b8d54b5d7-wf6fb\" (UID: \"aa6dd699-ccd5-476f-ab9c-3d4841ed591a\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.470460 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.470326 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bdm\" (UniqueName: \"kubernetes.io/projected/cedc986e-ac92-45e8-862a-fc4dcb60455d-kube-api-access-l9bdm\") pod \"swift-operator-controller-manager-bc7dc7bd9-l4p8f\" (UID: \"cedc986e-ac92-45e8-862a-fc4dcb60455d\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.473368 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.478357 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.482742 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zmbk6" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.483204 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.491078 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbztv\" (UniqueName: \"kubernetes.io/projected/a3762232-4e9f-452e-aea4-c5feb443ad75-kube-api-access-gbztv\") pod \"placement-operator-controller-manager-589c58c6c-9dzn5\" (UID: \"a3762232-4e9f-452e-aea4-c5feb443ad75\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.495724 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.523704 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.524688 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.527331 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.531152 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-hkgfg" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.559691 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6hmh\" (UniqueName: \"kubernetes.io/projected/129d5672-c8dd-4a63-8d48-dc95c84a45b2-kube-api-access-g6hmh\") pod \"test-operator-controller-manager-f66b554c6-bvj6l\" (UID: \"129d5672-c8dd-4a63-8d48-dc95c84a45b2\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.565287 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl5lq\" (UniqueName: \"kubernetes.io/projected/347e3ac8-4477-4bab-a64b-a443098bb400-kube-api-access-sl5lq\") pod \"watcher-operator-controller-manager-76669f99c-bwq8t\" (UID: \"347e3ac8-4477-4bab-a64b-a443098bb400\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.566108 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.583163 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.631568 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.671030 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b68346-543d-4b80-ba31-9bcb856b6989-cert\") pod \"openstack-operator-controller-manager-5f5687bfdd-7nt8g\" (UID: \"72b68346-543d-4b80-ba31-9bcb856b6989\") " pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.671106 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl5lq\" (UniqueName: \"kubernetes.io/projected/347e3ac8-4477-4bab-a64b-a443098bb400-kube-api-access-sl5lq\") pod \"watcher-operator-controller-manager-76669f99c-bwq8t\" (UID: \"347e3ac8-4477-4bab-a64b-a443098bb400\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.671157 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.671176 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh5sr\" (UniqueName: \"kubernetes.io/projected/677c476e-c8df-4a21-9968-b2bd23b246f6-kube-api-access-wh5sr\") pod \"rabbitmq-cluster-operator-manager-79d8469568-xkkwv\" (UID: \"677c476e-c8df-4a21-9968-b2bd23b246f6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.671207 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6hmh\" (UniqueName: \"kubernetes.io/projected/129d5672-c8dd-4a63-8d48-dc95c84a45b2-kube-api-access-g6hmh\") pod \"test-operator-controller-manager-f66b554c6-bvj6l\" (UID: \"129d5672-c8dd-4a63-8d48-dc95c84a45b2\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.671243 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5b9b\" (UniqueName: \"kubernetes.io/projected/72b68346-543d-4b80-ba31-9bcb856b6989-kube-api-access-l5b9b\") pod \"openstack-operator-controller-manager-5f5687bfdd-7nt8g\" (UID: \"72b68346-543d-4b80-ba31-9bcb856b6989\") " pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: E0930 14:13:44.671600 4676 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 14:13:44 crc kubenswrapper[4676]: E0930 14:13:44.671667 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert podName:96c8a26f-c044-429d-90eb-d0342486c32f nodeName:}" failed. No retries permitted until 2025-09-30 14:13:45.671646655 +0000 UTC m=+929.654735074 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-qd7b8" (UID: "96c8a26f-c044-429d-90eb-d0342486c32f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.693981 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.707080 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6hmh\" (UniqueName: \"kubernetes.io/projected/129d5672-c8dd-4a63-8d48-dc95c84a45b2-kube-api-access-g6hmh\") pod \"test-operator-controller-manager-f66b554c6-bvj6l\" (UID: \"129d5672-c8dd-4a63-8d48-dc95c84a45b2\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.717792 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl5lq\" (UniqueName: \"kubernetes.io/projected/347e3ac8-4477-4bab-a64b-a443098bb400-kube-api-access-sl5lq\") pod \"watcher-operator-controller-manager-76669f99c-bwq8t\" (UID: \"347e3ac8-4477-4bab-a64b-a443098bb400\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.744339 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.758997 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.776454 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh5sr\" (UniqueName: \"kubernetes.io/projected/677c476e-c8df-4a21-9968-b2bd23b246f6-kube-api-access-wh5sr\") pod \"rabbitmq-cluster-operator-manager-79d8469568-xkkwv\" (UID: \"677c476e-c8df-4a21-9968-b2bd23b246f6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.776614 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5b9b\" (UniqueName: \"kubernetes.io/projected/72b68346-543d-4b80-ba31-9bcb856b6989-kube-api-access-l5b9b\") pod \"openstack-operator-controller-manager-5f5687bfdd-7nt8g\" (UID: \"72b68346-543d-4b80-ba31-9bcb856b6989\") " pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.776704 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b68346-543d-4b80-ba31-9bcb856b6989-cert\") pod \"openstack-operator-controller-manager-5f5687bfdd-7nt8g\" (UID: \"72b68346-543d-4b80-ba31-9bcb856b6989\") " pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.783846 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b68346-543d-4b80-ba31-9bcb856b6989-cert\") pod \"openstack-operator-controller-manager-5f5687bfdd-7nt8g\" (UID: \"72b68346-543d-4b80-ba31-9bcb856b6989\") " pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.786187 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.802395 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5b9b\" (UniqueName: \"kubernetes.io/projected/72b68346-543d-4b80-ba31-9bcb856b6989-kube-api-access-l5b9b\") pod \"openstack-operator-controller-manager-5f5687bfdd-7nt8g\" (UID: \"72b68346-543d-4b80-ba31-9bcb856b6989\") " pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.807697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh5sr\" (UniqueName: \"kubernetes.io/projected/677c476e-c8df-4a21-9968-b2bd23b246f6-kube-api-access-wh5sr\") pod \"rabbitmq-cluster-operator-manager-79d8469568-xkkwv\" (UID: \"677c476e-c8df-4a21-9968-b2bd23b246f6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.819875 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.845860 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.854954 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.895163 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.908774 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.920308 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg"] Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.921451 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.935561 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:44 crc kubenswrapper[4676]: I0930 14:13:44.956562 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.155105 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.171271 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.204080 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-n5djz"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.342974 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.345786 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" event={"ID":"2342a742-ce41-4487-9d32-34fc69cb4445","Type":"ContainerStarted","Data":"28ad79bc92f5956165ac5f54be3230a6436509d074b547764767f9f0685b0899"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.352068 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" event={"ID":"5e535753-178a-4b7b-b20c-e13fa0be5ce1","Type":"ContainerStarted","Data":"822395e4e8f4e763ac20368d174bc9461c9922954b47d8af17a543e83a1e8b8b"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.353328 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.353387 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" 
event={"ID":"b5ead6b1-3f68-454e-847c-89cac8d7f1f0","Type":"ContainerStarted","Data":"20de6280a0869fda4c4a888f5c7c744826700174f5349afb773ebcd1574a4899"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.355673 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" event={"ID":"1d51c97e-7c47-4274-8bd4-bc3d7402a378","Type":"ContainerStarted","Data":"adbeedb64721cd3d50a80c174fe96d6af9ebea0473c99d9fdbaa8c67874e745c"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.357159 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" event={"ID":"9e7d83e3-0f96-4a53-88ad-568d39435e5f","Type":"ContainerStarted","Data":"d5f45fc50528743cdd078e75f178f6e6c8f541dd95f229d366f9ef636b8ce0b5"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.358220 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" event={"ID":"d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85","Type":"ContainerStarted","Data":"6563b8a8a00c0588ec50073a5c4a7a0d10c062ddd8195fb78c0521ddd7848d7f"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.359158 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" event={"ID":"5dbc4210-e31a-4bf8-a5cb-6f00a7406743","Type":"ContainerStarted","Data":"ca7327dae4bb274031f6aefaf734e0d5e30065e761ad85058aaa385745998af7"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.359985 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" event={"ID":"930c8b21-3bfd-497b-9bc7-60f2cf7abde6","Type":"ContainerStarted","Data":"312e4dbd904a38f7755a4b9457fb88b4abad4b3c8834da890c23a30dd8e9bcb6"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.364294 4676 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" event={"ID":"8f9d1069-29eb-42e5-8029-1ed616f31c4a","Type":"ContainerStarted","Data":"f05e396b16122bb0cdf303de33e7573363a8af36e4d4b78898cd2de1c82af309"} Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.434220 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.436406 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.460559 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.615900 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.620721 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb"] Sep 30 14:13:45 crc kubenswrapper[4676]: W0930 14:13:45.623078 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51ea15a_8d04_4d56_956d_0fcf41846eb8.slice/crio-b6944fab10c6a90d348613534082fbac06057ece70d7ee9189c93e96f2f42ce5 WatchSource:0}: Error finding container b6944fab10c6a90d348613534082fbac06057ece70d7ee9189c93e96f2f42ce5: Status 404 returned error can't find the container with id b6944fab10c6a90d348613534082fbac06057ece70d7ee9189c93e96f2f42ce5 Sep 30 14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.631866 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pdjqx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-wf6fb_openstack-operators(aa6dd699-ccd5-476f-ab9c-3d4841ed591a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.719780 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.724391 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.728534 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.740230 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/96c8a26f-c044-429d-90eb-d0342486c32f-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-qd7b8\" (UID: \"96c8a26f-c044-429d-90eb-d0342486c32f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:45 crc kubenswrapper[4676]: W0930 14:13:45.744160 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43f93725_c577_4253_ae9c_7d14e8aec0b9.slice/crio-3cde8b2e280297b680255cdbc38df091fe359fdc8f2db45aa7e22c7e9a9316ae WatchSource:0}: Error finding container 3cde8b2e280297b680255cdbc38df091fe359fdc8f2db45aa7e22c7e9a9316ae: Status 404 returned error can't find the container with id 3cde8b2e280297b680255cdbc38df091fe359fdc8f2db45aa7e22c7e9a9316ae Sep 30 14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.748733 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kgrfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-76fcc6dc7c-8m45v_openstack-operators(43f93725-c577-4253-ae9c-7d14e8aec0b9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.816098 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" podUID="aa6dd699-ccd5-476f-ab9c-3d4841ed591a" Sep 30 14:13:45 crc 
kubenswrapper[4676]: I0930 14:13:45.818094 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.837911 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l"] Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.856457 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv"] Sep 30 14:13:45 crc kubenswrapper[4676]: W0930 14:13:45.861335 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod129d5672_c8dd_4a63_8d48_dc95c84a45b2.slice/crio-3e5317b4f735514fb8aeb10131273be6d284b9d3508d12e82de355f7ef7697d1 WatchSource:0}: Error finding container 3e5317b4f735514fb8aeb10131273be6d284b9d3508d12e82de355f7ef7697d1: Status 404 returned error can't find the container with id 3e5317b4f735514fb8aeb10131273be6d284b9d3508d12e82de355f7ef7697d1 Sep 30 14:13:45 crc kubenswrapper[4676]: W0930 14:13:45.862298 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677c476e_c8df_4a21_9968_b2bd23b246f6.slice/crio-ec7b027597e23901ae5ef7f1134d8d72adfab35ce6b7c731af0f3a7578b4d4df WatchSource:0}: Error finding container ec7b027597e23901ae5ef7f1134d8d72adfab35ce6b7c731af0f3a7578b4d4df: Status 404 returned error can't find the container with id ec7b027597e23901ae5ef7f1134d8d72adfab35ce6b7c731af0f3a7578b4d4df Sep 30 14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.865856 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wh5sr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-xkkwv_openstack-operators(677c476e-c8df-4a21-9968-b2bd23b246f6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 
14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.867155 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" podUID="677c476e-c8df-4a21-9968-b2bd23b246f6" Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.870473 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t"] Sep 30 14:13:45 crc kubenswrapper[4676]: W0930 14:13:45.880286 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod347e3ac8_4477_4bab_a64b_a443098bb400.slice/crio-a9dfa27c9c401a444b2aa8138020c72c3fad41e50da8581d54eef929efdc6440 WatchSource:0}: Error finding container a9dfa27c9c401a444b2aa8138020c72c3fad41e50da8581d54eef929efdc6440: Status 404 returned error can't find the container with id a9dfa27c9c401a444b2aa8138020c72c3fad41e50da8581d54eef929efdc6440 Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.880411 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5"] Sep 30 14:13:45 crc kubenswrapper[4676]: W0930 14:13:45.882129 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72b68346_543d_4b80_ba31_9bcb856b6989.slice/crio-90931d16a77f78fd9ab6ffba17f20e504da99c6fcb494a333cfd536b3f3c5d1e WatchSource:0}: Error finding container 90931d16a77f78fd9ab6ffba17f20e504da99c6fcb494a333cfd536b3f3c5d1e: Status 404 returned error can't find the container with id 90931d16a77f78fd9ab6ffba17f20e504da99c6fcb494a333cfd536b3f3c5d1e Sep 30 14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.883664 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sl5lq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76669f99c-bwq8t_openstack-operators(347e3ac8-4477-4bab-a64b-a443098bb400): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 14:13:45 crc kubenswrapper[4676]: W0930 14:13:45.884838 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3762232_4e9f_452e_aea4_c5feb443ad75.slice/crio-fbf7b28a211638dfd4955379cbc20d946b80491d32883db3ebe97c2123cd87b5 WatchSource:0}: Error finding container fbf7b28a211638dfd4955379cbc20d946b80491d32883db3ebe97c2123cd87b5: Status 404 returned error can't find the container with id fbf7b28a211638dfd4955379cbc20d946b80491d32883db3ebe97c2123cd87b5 Sep 30 14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.888112 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gbztv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-589c58c6c-9dzn5_openstack-operators(a3762232-4e9f-452e-aea4-c5feb443ad75): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 14:13:45 crc kubenswrapper[4676]: E0930 14:13:45.926260 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" podUID="43f93725-c577-4253-ae9c-7d14e8aec0b9" Sep 30 14:13:45 crc kubenswrapper[4676]: I0930 14:13:45.929181 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:13:46 crc kubenswrapper[4676]: E0930 14:13:46.109839 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" podUID="347e3ac8-4477-4bab-a64b-a443098bb400" Sep 30 14:13:46 crc kubenswrapper[4676]: E0930 14:13:46.143697 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" podUID="a3762232-4e9f-452e-aea4-c5feb443ad75" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.374851 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" event={"ID":"a3762232-4e9f-452e-aea4-c5feb443ad75","Type":"ContainerStarted","Data":"6b5abaf4bcfddb8267dffb3d0c9857d35eb5d73128e6a5bac481aac4a0bfd02f"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.374918 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" 
event={"ID":"a3762232-4e9f-452e-aea4-c5feb443ad75","Type":"ContainerStarted","Data":"fbf7b28a211638dfd4955379cbc20d946b80491d32883db3ebe97c2123cd87b5"} Sep 30 14:13:46 crc kubenswrapper[4676]: E0930 14:13:46.376551 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" podUID="a3762232-4e9f-452e-aea4-c5feb443ad75" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.384267 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" event={"ID":"aa6dd699-ccd5-476f-ab9c-3d4841ed591a","Type":"ContainerStarted","Data":"0757a62c0a3fa9de4e26b068e0bec02ddf59583d09f5c341e06ededf86b43349"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.384317 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" event={"ID":"aa6dd699-ccd5-476f-ab9c-3d4841ed591a","Type":"ContainerStarted","Data":"e8defbe34efb10fdb3f3c24e6b25e43205451e8a9f2b2b809717b782b6f5006f"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.386425 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" event={"ID":"dbe0db98-4cbd-49d2-9f6a-f54a8189c64b","Type":"ContainerStarted","Data":"c4cec895372256f38d275262afdd346448f623e83d41f8f742cb4b7e6abda407"} Sep 30 14:13:46 crc kubenswrapper[4676]: E0930 14:13:46.387235 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" podUID="aa6dd699-ccd5-476f-ab9c-3d4841ed591a" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.388404 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" event={"ID":"677c476e-c8df-4a21-9968-b2bd23b246f6","Type":"ContainerStarted","Data":"ec7b027597e23901ae5ef7f1134d8d72adfab35ce6b7c731af0f3a7578b4d4df"} Sep 30 14:13:46 crc kubenswrapper[4676]: E0930 14:13:46.390293 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" podUID="677c476e-c8df-4a21-9968-b2bd23b246f6" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.393391 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" event={"ID":"43f93725-c577-4253-ae9c-7d14e8aec0b9","Type":"ContainerStarted","Data":"1d9c86184cb5516b7f63457178f68c2de2f31426ad53ad96d0711634d9b0e795"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.393468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" event={"ID":"43f93725-c577-4253-ae9c-7d14e8aec0b9","Type":"ContainerStarted","Data":"3cde8b2e280297b680255cdbc38df091fe359fdc8f2db45aa7e22c7e9a9316ae"} Sep 30 14:13:46 crc kubenswrapper[4676]: E0930 14:13:46.394830 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" podUID="43f93725-c577-4253-ae9c-7d14e8aec0b9" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.398104 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" event={"ID":"347e3ac8-4477-4bab-a64b-a443098bb400","Type":"ContainerStarted","Data":"e0f396275bf0671a3c1fb4fcc071e2683d46e48ad824f10ed70de95a09a50fcc"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.398162 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" event={"ID":"347e3ac8-4477-4bab-a64b-a443098bb400","Type":"ContainerStarted","Data":"a9dfa27c9c401a444b2aa8138020c72c3fad41e50da8581d54eef929efdc6440"} Sep 30 14:13:46 crc kubenswrapper[4676]: E0930 14:13:46.399776 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" podUID="347e3ac8-4477-4bab-a64b-a443098bb400" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.429384 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" event={"ID":"72b68346-543d-4b80-ba31-9bcb856b6989","Type":"ContainerStarted","Data":"1471a46202f8ad25d96e7112146c51f0792812ce2a76461ec8addece2f2034f8"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.429430 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" 
event={"ID":"72b68346-543d-4b80-ba31-9bcb856b6989","Type":"ContainerStarted","Data":"2f655c4827ae82295b4ad4e383d991902420f3c0c8e7bd0c98607547bb73ff54"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.429450 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" event={"ID":"72b68346-543d-4b80-ba31-9bcb856b6989","Type":"ContainerStarted","Data":"90931d16a77f78fd9ab6ffba17f20e504da99c6fcb494a333cfd536b3f3c5d1e"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.429949 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.435153 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" event={"ID":"c15f6efe-27f0-4f55-b1f0-957366ff23a4","Type":"ContainerStarted","Data":"15c01abf85226856d5f8f90c1f0a302d6c54be82e124e1510a4eb99fa18b5b75"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.465404 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" event={"ID":"129d5672-c8dd-4a63-8d48-dc95c84a45b2","Type":"ContainerStarted","Data":"3e5317b4f735514fb8aeb10131273be6d284b9d3508d12e82de355f7ef7697d1"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.476625 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" event={"ID":"cedc986e-ac92-45e8-862a-fc4dcb60455d","Type":"ContainerStarted","Data":"695899687e5d8815d9a024217892e7ed44efd33cb3d797e1b8629bf48e35fc18"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.503930 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8"] Sep 30 14:13:46 crc 
kubenswrapper[4676]: I0930 14:13:46.504212 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" event={"ID":"daee0b60-331c-4108-8881-66cf4eb731e0","Type":"ContainerStarted","Data":"652e72e008dde357927781e7b00df9804d76d3dd961ea321ce49035842d9c727"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.560928 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" podStartSLOduration=2.560896954 podStartE2EDuration="2.560896954s" podCreationTimestamp="2025-09-30 14:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:13:46.555156264 +0000 UTC m=+930.538244713" watchObservedRunningTime="2025-09-30 14:13:46.560896954 +0000 UTC m=+930.543985393" Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.562375 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" event={"ID":"7e6672d2-5e94-4d5d-b927-ad3573b95469","Type":"ContainerStarted","Data":"675acb1aefb54387ab88c151be4de389ded5b60fcc97f06ec25a0dae76a1e3df"} Sep 30 14:13:46 crc kubenswrapper[4676]: I0930 14:13:46.581577 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" event={"ID":"e51ea15a-8d04-4d56-956d-0fcf41846eb8","Type":"ContainerStarted","Data":"b6944fab10c6a90d348613534082fbac06057ece70d7ee9189c93e96f2f42ce5"} Sep 30 14:13:47 crc kubenswrapper[4676]: I0930 14:13:47.596957 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" event={"ID":"96c8a26f-c044-429d-90eb-d0342486c32f","Type":"ContainerStarted","Data":"4d5a62f162bdcb42371e813f1c08a739e69db71c480412c405833e89e8aadb9d"} Sep 30 
14:13:47 crc kubenswrapper[4676]: E0930 14:13:47.599823 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" podUID="677c476e-c8df-4a21-9968-b2bd23b246f6" Sep 30 14:13:47 crc kubenswrapper[4676]: E0930 14:13:47.600226 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" podUID="aa6dd699-ccd5-476f-ab9c-3d4841ed591a" Sep 30 14:13:47 crc kubenswrapper[4676]: E0930 14:13:47.600332 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" podUID="43f93725-c577-4253-ae9c-7d14e8aec0b9" Sep 30 14:13:47 crc kubenswrapper[4676]: E0930 14:13:47.600359 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" podUID="a3762232-4e9f-452e-aea4-c5feb443ad75" Sep 30 14:13:47 crc kubenswrapper[4676]: E0930 14:13:47.600444 4676 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" podUID="347e3ac8-4477-4bab-a64b-a443098bb400" Sep 30 14:13:54 crc kubenswrapper[4676]: I0930 14:13:54.944251 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5f5687bfdd-7nt8g" Sep 30 14:13:58 crc kubenswrapper[4676]: I0930 14:13:58.676239 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" event={"ID":"5e535753-178a-4b7b-b20c-e13fa0be5ce1","Type":"ContainerStarted","Data":"57734450e7e9e0d1e2bd24829f1a51ae4144d46bf46d01bc45f5a9c4a145dde7"} Sep 30 14:13:58 crc kubenswrapper[4676]: I0930 14:13:58.678131 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" event={"ID":"daee0b60-331c-4108-8881-66cf4eb731e0","Type":"ContainerStarted","Data":"bbf18a30b75e0b416300509ac29740f2ff7284354058aec5c45edbe1565b9287"} Sep 30 14:13:58 crc kubenswrapper[4676]: I0930 14:13:58.681701 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" event={"ID":"8f9d1069-29eb-42e5-8029-1ed616f31c4a","Type":"ContainerStarted","Data":"4825b6789c80d69cbdc9d154c27c887b8fe800aa9ffcaf128eb4a27e420c225c"} Sep 30 14:13:58 crc kubenswrapper[4676]: I0930 14:13:58.683866 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" event={"ID":"930c8b21-3bfd-497b-9bc7-60f2cf7abde6","Type":"ContainerStarted","Data":"0c0823fb335071c7816f7e1f396eb03b7580d12e6af7cd2930f4d2aaec7ae2f7"} Sep 30 
14:13:58 crc kubenswrapper[4676]: I0930 14:13:58.685619 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" event={"ID":"7e6672d2-5e94-4d5d-b927-ad3573b95469","Type":"ContainerStarted","Data":"4d74ddccb2406b9a5edced477abb216e4f7d3a831f52d16259e694505132a45d"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.696115 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" event={"ID":"dbe0db98-4cbd-49d2-9f6a-f54a8189c64b","Type":"ContainerStarted","Data":"b7e5194e7f750d41867e5cd1a09452b69a2742645c187349212db40ab302357c"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.697972 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" event={"ID":"1d51c97e-7c47-4274-8bd4-bc3d7402a378","Type":"ContainerStarted","Data":"442f2a553be29e6fc294b8b3b97f7b9e0b7bd6c62f059b902ce37e43d18e84a2"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.700124 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" event={"ID":"8f9d1069-29eb-42e5-8029-1ed616f31c4a","Type":"ContainerStarted","Data":"28514708de53510e3d2cf10a0223ab6f3213a8f11c92909f0701470c56e88080"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.701399 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.709135 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" event={"ID":"96c8a26f-c044-429d-90eb-d0342486c32f","Type":"ContainerStarted","Data":"712e125cb43308a02dc48cf18e84ddc2ad52d873ffd88455afd3d6266aa84636"} Sep 30 14:13:59 crc 
kubenswrapper[4676]: I0930 14:13:59.716215 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" event={"ID":"b5ead6b1-3f68-454e-847c-89cac8d7f1f0","Type":"ContainerStarted","Data":"b6061be91b128332c8134cbc6257c6658a2484f8a5ace426e652dab8cd359309"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.726395 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" podStartSLOduration=3.641088762 podStartE2EDuration="16.726377426s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:44.961318506 +0000 UTC m=+928.944406935" lastFinishedPulling="2025-09-30 14:13:58.04660717 +0000 UTC m=+942.029695599" observedRunningTime="2025-09-30 14:13:59.724963316 +0000 UTC m=+943.708051755" watchObservedRunningTime="2025-09-30 14:13:59.726377426 +0000 UTC m=+943.709465855" Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.729650 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" event={"ID":"9e7d83e3-0f96-4a53-88ad-568d39435e5f","Type":"ContainerStarted","Data":"1607bcbb1b682db888e9f25cc72d4311abdd2a83a0a9d05705666e1574efc76e"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.735046 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" event={"ID":"129d5672-c8dd-4a63-8d48-dc95c84a45b2","Type":"ContainerStarted","Data":"9625fe81608ca5bd3728ecd749f476956578b405c6a78c3c538d24a372e1a176"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.738461 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" 
event={"ID":"930c8b21-3bfd-497b-9bc7-60f2cf7abde6","Type":"ContainerStarted","Data":"e43624c1f08ad9498f994f3829e31e2fd946d40d55227a8f448e43e1914aec93"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.738601 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.741896 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" event={"ID":"7e6672d2-5e94-4d5d-b927-ad3573b95469","Type":"ContainerStarted","Data":"3ee272093fa1ee11226ccc238a4a5e6949884545a4833df40076986cafdf61e3"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.742075 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.745947 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" event={"ID":"2342a742-ce41-4487-9d32-34fc69cb4445","Type":"ContainerStarted","Data":"327e467df250d6f08ce29ca42e9b4d431d18ea24527df99fb8c3ca049b7a8868"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.748033 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" event={"ID":"5e535753-178a-4b7b-b20c-e13fa0be5ce1","Type":"ContainerStarted","Data":"75ca7062f3c0338d191bd5bc35236e1df0505ba32b84b13366b2987c7e4b8c48"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.748256 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.750602 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" event={"ID":"cedc986e-ac92-45e8-862a-fc4dcb60455d","Type":"ContainerStarted","Data":"fba536da68d24cc873b27005e2a1e4031af2f0ce136976586e26e14d4ffea8ad"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.771489 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" event={"ID":"e51ea15a-8d04-4d56-956d-0fcf41846eb8","Type":"ContainerStarted","Data":"5e77cafb15d3074c11a380b443206309f05c5c3efc29d44389cc06943ca70895"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.781038 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" event={"ID":"d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85","Type":"ContainerStarted","Data":"b73710e506290ca7df4ec498fe258964790c1c4f92cdf3ca431ca5a03a8c2bb5"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.783689 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" podStartSLOduration=3.701161161 podStartE2EDuration="16.783654277s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:44.952680646 +0000 UTC m=+928.935769075" lastFinishedPulling="2025-09-30 14:13:58.035173762 +0000 UTC m=+942.018262191" observedRunningTime="2025-09-30 14:13:59.774483993 +0000 UTC m=+943.757572422" watchObservedRunningTime="2025-09-30 14:13:59.783654277 +0000 UTC m=+943.766742706" Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.804375 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" event={"ID":"5dbc4210-e31a-4bf8-a5cb-6f00a7406743","Type":"ContainerStarted","Data":"28d873981a996e377e3f132bae38206ea4e532fdd3edbac38d78e0e19df314de"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.816748 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" podStartSLOduration=3.97491322 podStartE2EDuration="16.816727077s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.203871677 +0000 UTC m=+929.186960106" lastFinishedPulling="2025-09-30 14:13:58.045685534 +0000 UTC m=+942.028773963" observedRunningTime="2025-09-30 14:13:59.805018912 +0000 UTC m=+943.788107361" watchObservedRunningTime="2025-09-30 14:13:59.816727077 +0000 UTC m=+943.799815506" Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.823206 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" event={"ID":"c15f6efe-27f0-4f55-b1f0-957366ff23a4","Type":"ContainerStarted","Data":"2e9412394942613f5b1e5736f5259353d9d22fa53f363b95dfc7b4108aa0936e"} Sep 30 14:13:59 crc kubenswrapper[4676]: I0930 14:13:59.886503 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" podStartSLOduration=4.341496669 podStartE2EDuration="16.886470665s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.468237325 +0000 UTC m=+929.451325764" lastFinishedPulling="2025-09-30 14:13:58.013211341 +0000 UTC m=+941.996299760" observedRunningTime="2025-09-30 14:13:59.877216058 +0000 UTC m=+943.860304497" watchObservedRunningTime="2025-09-30 14:13:59.886470665 +0000 UTC m=+943.869559104" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.840475 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" event={"ID":"129d5672-c8dd-4a63-8d48-dc95c84a45b2","Type":"ContainerStarted","Data":"09e3e4b1e19a192cb03740bc9d1efd9e7f2d34c1b3c0a71b94a03fd032ff3131"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.840842 
4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.847646 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" event={"ID":"e51ea15a-8d04-4d56-956d-0fcf41846eb8","Type":"ContainerStarted","Data":"2f4dd9dcdacba9003ee2ec7f14c02ee76f3087c2d75b3b38dede7b3af4009851"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.847905 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.858633 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" event={"ID":"2342a742-ce41-4487-9d32-34fc69cb4445","Type":"ContainerStarted","Data":"22f74c7f4e888f809a7a6fe0402d8d9de10d598ca611483de5d4ef30893a2e9d"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.859858 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.862654 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" event={"ID":"1d51c97e-7c47-4274-8bd4-bc3d7402a378","Type":"ContainerStarted","Data":"7ec8c2296f7acb839212e621c57d595ba44b4767732ed04d2edfe0695a9be465"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.864005 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.869335 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" podStartSLOduration=5.697743674 podStartE2EDuration="17.869307382s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.864657813 +0000 UTC m=+929.847746242" lastFinishedPulling="2025-09-30 14:13:58.036221521 +0000 UTC m=+942.019309950" observedRunningTime="2025-09-30 14:14:00.867642335 +0000 UTC m=+944.850730784" watchObservedRunningTime="2025-09-30 14:14:00.869307382 +0000 UTC m=+944.852395811" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.874862 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" event={"ID":"9e7d83e3-0f96-4a53-88ad-568d39435e5f","Type":"ContainerStarted","Data":"cb3ce56dfdee2d3381f07e69f03819bdf4ef359a950d06fa826930e4892928e9"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.875056 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.879255 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" event={"ID":"b5ead6b1-3f68-454e-847c-89cac8d7f1f0","Type":"ContainerStarted","Data":"51fb8299c8ae220d75dcffd6adeff99c3e92c626ae5bce8126ec12f593e6d8dd"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.879408 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.883365 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" event={"ID":"5dbc4210-e31a-4bf8-a5cb-6f00a7406743","Type":"ContainerStarted","Data":"790be7f479a9aa45fa13c1bfd6a1c8dd8a6b3831b084d3f5a794aee87941f708"} Sep 30 14:14:00 
crc kubenswrapper[4676]: I0930 14:14:00.883743 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.894076 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" podStartSLOduration=4.526424228 podStartE2EDuration="17.894050029s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:44.842591546 +0000 UTC m=+928.825679975" lastFinishedPulling="2025-09-30 14:13:58.210217347 +0000 UTC m=+942.193305776" observedRunningTime="2025-09-30 14:14:00.888607858 +0000 UTC m=+944.871696287" watchObservedRunningTime="2025-09-30 14:14:00.894050029 +0000 UTC m=+944.877138458" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.894566 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" event={"ID":"96c8a26f-c044-429d-90eb-d0342486c32f","Type":"ContainerStarted","Data":"ce653c81b158b99115c4b861af8e86c2c3c6594122db82f2f10d0e870a672482"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.894721 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.897794 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" event={"ID":"c15f6efe-27f0-4f55-b1f0-957366ff23a4","Type":"ContainerStarted","Data":"db09b766e18d74f757dccd0bc20cc270f699b00b27c1422e250be59da03da2b0"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.897962 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:14:00 
crc kubenswrapper[4676]: I0930 14:14:00.900703 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" event={"ID":"dbe0db98-4cbd-49d2-9f6a-f54a8189c64b","Type":"ContainerStarted","Data":"4cfc285046be8de489639be45976e9d2c957d9cab66d1723dd09258220d676c7"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.900817 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.914744 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" event={"ID":"d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85","Type":"ContainerStarted","Data":"94b034cb245f45834f3ac60aced68b53dd4560898502676ad0318ad80cbc2370"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.915606 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.915794 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" podStartSLOduration=5.07878061 podStartE2EDuration="17.915773883s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.237254765 +0000 UTC m=+929.220343194" lastFinishedPulling="2025-09-30 14:13:58.074248038 +0000 UTC m=+942.057336467" observedRunningTime="2025-09-30 14:14:00.913707646 +0000 UTC m=+944.896796075" watchObservedRunningTime="2025-09-30 14:14:00.915773883 +0000 UTC m=+944.898862322" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.934490 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" 
event={"ID":"cedc986e-ac92-45e8-862a-fc4dcb60455d","Type":"ContainerStarted","Data":"b7cca74f9bc19eedffdd231837d685627785a5b2264ed5a2527cc2ad8ed6620a"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.935944 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.940522 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" event={"ID":"daee0b60-331c-4108-8881-66cf4eb731e0","Type":"ContainerStarted","Data":"0b41f0104d12143866151894480878901862a161e2f76d6308d2ea2836c979ae"} Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.940583 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.946406 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" podStartSLOduration=5.497549099 podStartE2EDuration="17.946386084s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.625368032 +0000 UTC m=+929.608456461" lastFinishedPulling="2025-09-30 14:13:58.074205017 +0000 UTC m=+942.057293446" observedRunningTime="2025-09-30 14:14:00.942415883 +0000 UTC m=+944.925504332" watchObservedRunningTime="2025-09-30 14:14:00.946386084 +0000 UTC m=+944.929474513" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.969041 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" podStartSLOduration=5.18272484 podStartE2EDuration="17.969012443s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.259342099 +0000 UTC m=+929.242430528" 
lastFinishedPulling="2025-09-30 14:13:58.045629712 +0000 UTC m=+942.028718131" observedRunningTime="2025-09-30 14:14:00.966753349 +0000 UTC m=+944.949841808" watchObservedRunningTime="2025-09-30 14:14:00.969012443 +0000 UTC m=+944.952100872" Sep 30 14:14:00 crc kubenswrapper[4676]: I0930 14:14:00.995483 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" podStartSLOduration=5.166587012 podStartE2EDuration="17.995459818s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.341309828 +0000 UTC m=+929.324398257" lastFinishedPulling="2025-09-30 14:13:58.170182624 +0000 UTC m=+942.153271063" observedRunningTime="2025-09-30 14:14:00.988311569 +0000 UTC m=+944.971400018" watchObservedRunningTime="2025-09-30 14:14:00.995459818 +0000 UTC m=+944.978548247" Sep 30 14:14:01 crc kubenswrapper[4676]: I0930 14:14:01.102004 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" podStartSLOduration=6.637188354 podStartE2EDuration="18.101979868s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:46.580845378 +0000 UTC m=+930.563933807" lastFinishedPulling="2025-09-30 14:13:58.045636892 +0000 UTC m=+942.028725321" observedRunningTime="2025-09-30 14:14:01.096348042 +0000 UTC m=+945.079436471" watchObservedRunningTime="2025-09-30 14:14:01.101979868 +0000 UTC m=+945.085068297" Sep 30 14:14:01 crc kubenswrapper[4676]: I0930 14:14:01.104064 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" podStartSLOduration=5.508211696 podStartE2EDuration="18.104051685s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.449826453 +0000 UTC m=+929.432914883" 
lastFinishedPulling="2025-09-30 14:13:58.045666453 +0000 UTC m=+942.028754872" observedRunningTime="2025-09-30 14:14:01.011827073 +0000 UTC m=+944.994915502" watchObservedRunningTime="2025-09-30 14:14:01.104051685 +0000 UTC m=+945.087140124" Sep 30 14:14:01 crc kubenswrapper[4676]: I0930 14:14:01.184857 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" podStartSLOduration=5.109375501 podStartE2EDuration="18.184836191s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:44.987128904 +0000 UTC m=+928.970217323" lastFinishedPulling="2025-09-30 14:13:58.062589584 +0000 UTC m=+942.045678013" observedRunningTime="2025-09-30 14:14:01.183492203 +0000 UTC m=+945.166580642" watchObservedRunningTime="2025-09-30 14:14:01.184836191 +0000 UTC m=+945.167924620" Sep 30 14:14:01 crc kubenswrapper[4676]: I0930 14:14:01.188023 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" podStartSLOduration=4.555141697 podStartE2EDuration="18.188009849s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:44.531235823 +0000 UTC m=+928.514324252" lastFinishedPulling="2025-09-30 14:13:58.164103975 +0000 UTC m=+942.147192404" observedRunningTime="2025-09-30 14:14:01.146798814 +0000 UTC m=+945.129887243" watchObservedRunningTime="2025-09-30 14:14:01.188009849 +0000 UTC m=+945.171098268" Sep 30 14:14:01 crc kubenswrapper[4676]: I0930 14:14:01.219753 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" podStartSLOduration=5.787441877 podStartE2EDuration="18.219735411s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.738138017 +0000 UTC m=+929.721226446" lastFinishedPulling="2025-09-30 
14:13:58.170431551 +0000 UTC m=+942.153519980" observedRunningTime="2025-09-30 14:14:01.212178331 +0000 UTC m=+945.195266760" watchObservedRunningTime="2025-09-30 14:14:01.219735411 +0000 UTC m=+945.202823840" Sep 30 14:14:01 crc kubenswrapper[4676]: I0930 14:14:01.247325 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" podStartSLOduration=5.609201823 podStartE2EDuration="18.247304457s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.436643178 +0000 UTC m=+929.419731607" lastFinishedPulling="2025-09-30 14:13:58.074745812 +0000 UTC m=+942.057834241" observedRunningTime="2025-09-30 14:14:01.240783066 +0000 UTC m=+945.223871495" watchObservedRunningTime="2025-09-30 14:14:01.247304457 +0000 UTC m=+945.230392896" Sep 30 14:14:01 crc kubenswrapper[4676]: I0930 14:14:01.276770 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" podStartSLOduration=5.450727448 podStartE2EDuration="18.276742996s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.343467877 +0000 UTC m=+929.326556306" lastFinishedPulling="2025-09-30 14:13:58.169483425 +0000 UTC m=+942.152571854" observedRunningTime="2025-09-30 14:14:01.272694373 +0000 UTC m=+945.255782802" watchObservedRunningTime="2025-09-30 14:14:01.276742996 +0000 UTC m=+945.259831445" Sep 30 14:14:03 crc kubenswrapper[4676]: I0930 14:14:03.601302 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-7b844" Sep 30 14:14:03 crc kubenswrapper[4676]: I0930 14:14:03.615911 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7tf2x" Sep 30 14:14:03 crc kubenswrapper[4676]: 
I0930 14:14:03.633494 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7qc9p" Sep 30 14:14:03 crc kubenswrapper[4676]: I0930 14:14:03.796516 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-6qcf5" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.256055 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-s9znt" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.277633 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-4dzg7" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.351986 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-n5djz" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.404056 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-t6h2t" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.441030 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-8pv7q" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.473292 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-8fx9n" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.586314 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-gh2vw" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.640423 4676 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-mg54v" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.747533 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-g2s9t" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.859018 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-l4p8f" Sep 30 14:14:04 crc kubenswrapper[4676]: I0930 14:14:04.898299 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-bvj6l" Sep 30 14:14:05 crc kubenswrapper[4676]: I0930 14:14:05.934980 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-qd7b8" Sep 30 14:14:13 crc kubenswrapper[4676]: E0930 14:14:13.216206 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b" Sep 30 14:14:13 crc kubenswrapper[4676]: E0930 14:14:13.218135 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wh5sr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-xkkwv_openstack-operators(677c476e-c8df-4a21-9968-b2bd23b246f6): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Sep 30 14:14:13 crc kubenswrapper[4676]: E0930 14:14:13.219388 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" podUID="677c476e-c8df-4a21-9968-b2bd23b246f6" Sep 30 14:14:13 crc kubenswrapper[4676]: I0930 14:14:13.735032 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-zn6zg" Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.069353 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" event={"ID":"a3762232-4e9f-452e-aea4-c5feb443ad75","Type":"ContainerStarted","Data":"4f85069356b7a24ca9298857380e3e833046abc64c9daa1662d2015ed650beaf"} Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.071685 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" event={"ID":"43f93725-c577-4253-ae9c-7d14e8aec0b9","Type":"ContainerStarted","Data":"4c2599a3da47c25a9ac63c5ccb5e6b3e4fa97229b2e9b7d992b443eb5f372eac"} Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.072200 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.073481 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" event={"ID":"347e3ac8-4477-4bab-a64b-a443098bb400","Type":"ContainerStarted","Data":"cda8df67646b351f8fa265a28ff9595c9d508197f11b9d4f138d2a5ff402b925"} Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.073735 4676 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.075929 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" event={"ID":"aa6dd699-ccd5-476f-ab9c-3d4841ed591a","Type":"ContainerStarted","Data":"d8f6b6fa4cac3e57a121f74616e011cfd9c131d6869c7466cffb37884559fe81"} Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.076160 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.094155 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" podStartSLOduration=4.217447282 podStartE2EDuration="36.094127969s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.887890209 +0000 UTC m=+929.870978638" lastFinishedPulling="2025-09-30 14:14:17.764570906 +0000 UTC m=+961.747659325" observedRunningTime="2025-09-30 14:14:19.091013002 +0000 UTC m=+963.074101471" watchObservedRunningTime="2025-09-30 14:14:19.094127969 +0000 UTC m=+963.077216398" Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.114612 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" podStartSLOduration=3.973541284 podStartE2EDuration="36.114551557s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.631721099 +0000 UTC m=+929.614809528" lastFinishedPulling="2025-09-30 14:14:17.772731372 +0000 UTC m=+961.755819801" observedRunningTime="2025-09-30 14:14:19.112105958 +0000 UTC m=+963.095194387" watchObservedRunningTime="2025-09-30 14:14:19.114551557 +0000 UTC m=+963.097640006" Sep 30 
14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.137252 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" podStartSLOduration=4.12131268 podStartE2EDuration="36.137235507s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.748512665 +0000 UTC m=+929.731601094" lastFinishedPulling="2025-09-30 14:14:17.764435492 +0000 UTC m=+961.747523921" observedRunningTime="2025-09-30 14:14:19.132790174 +0000 UTC m=+963.115878603" watchObservedRunningTime="2025-09-30 14:14:19.137235507 +0000 UTC m=+963.120323936" Sep 30 14:14:19 crc kubenswrapper[4676]: I0930 14:14:19.154904 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" podStartSLOduration=4.274156048 podStartE2EDuration="36.154862707s" podCreationTimestamp="2025-09-30 14:13:43 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.883481406 +0000 UTC m=+929.866569835" lastFinishedPulling="2025-09-30 14:14:17.764188065 +0000 UTC m=+961.747276494" observedRunningTime="2025-09-30 14:14:19.150446544 +0000 UTC m=+963.133534983" watchObservedRunningTime="2025-09-30 14:14:19.154862707 +0000 UTC m=+963.137951136" Sep 30 14:14:24 crc kubenswrapper[4676]: I0930 14:14:24.760584 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" Sep 30 14:14:24 crc kubenswrapper[4676]: I0930 14:14:24.763753 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-8m45v" Sep 30 14:14:24 crc kubenswrapper[4676]: I0930 14:14:24.790559 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-wf6fb" Sep 30 14:14:24 crc kubenswrapper[4676]: I0930 
14:14:24.825025 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-9dzn5" Sep 30 14:14:24 crc kubenswrapper[4676]: I0930 14:14:24.925546 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-bwq8t" Sep 30 14:14:28 crc kubenswrapper[4676]: E0930 14:14:28.437539 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" podUID="677c476e-c8df-4a21-9968-b2bd23b246f6" Sep 30 14:14:40 crc kubenswrapper[4676]: I0930 14:14:40.435400 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:14:47 crc kubenswrapper[4676]: I0930 14:14:47.277260 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" event={"ID":"677c476e-c8df-4a21-9968-b2bd23b246f6","Type":"ContainerStarted","Data":"24170ef18301219d34f729821ff0b76a5278da936a7dc114610295685bb5259f"} Sep 30 14:14:47 crc kubenswrapper[4676]: I0930 14:14:47.304724 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-xkkwv" podStartSLOduration=2.685327132 podStartE2EDuration="1m3.304705985s" podCreationTimestamp="2025-09-30 14:13:44 +0000 UTC" firstStartedPulling="2025-09-30 14:13:45.865663251 +0000 UTC m=+929.848751680" lastFinishedPulling="2025-09-30 14:14:46.485042094 +0000 UTC m=+990.468130533" observedRunningTime="2025-09-30 14:14:47.302897735 +0000 UTC m=+991.285986164" watchObservedRunningTime="2025-09-30 
14:14:47.304705985 +0000 UTC m=+991.287794414" Sep 30 14:14:59 crc kubenswrapper[4676]: I0930 14:14:59.919694 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:14:59 crc kubenswrapper[4676]: I0930 14:14:59.920362 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.139505 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f"] Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.140473 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.142929 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.144200 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.160363 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f"] Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.206772 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59ee7394-b396-4e50-9ead-e46010aa9f40-secret-volume\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.206911 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ee7394-b396-4e50-9ead-e46010aa9f40-config-volume\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.206977 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzv8r\" (UniqueName: \"kubernetes.io/projected/59ee7394-b396-4e50-9ead-e46010aa9f40-kube-api-access-mzv8r\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.308253 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ee7394-b396-4e50-9ead-e46010aa9f40-config-volume\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.308321 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzv8r\" (UniqueName: \"kubernetes.io/projected/59ee7394-b396-4e50-9ead-e46010aa9f40-kube-api-access-mzv8r\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.308410 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59ee7394-b396-4e50-9ead-e46010aa9f40-secret-volume\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.309310 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ee7394-b396-4e50-9ead-e46010aa9f40-config-volume\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.319224 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/59ee7394-b396-4e50-9ead-e46010aa9f40-secret-volume\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.325224 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzv8r\" (UniqueName: \"kubernetes.io/projected/59ee7394-b396-4e50-9ead-e46010aa9f40-kube-api-access-mzv8r\") pod \"collect-profiles-29320695-ztt5f\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.459686 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:00 crc kubenswrapper[4676]: I0930 14:15:00.931422 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f"] Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.384638 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" event={"ID":"59ee7394-b396-4e50-9ead-e46010aa9f40","Type":"ContainerStarted","Data":"4194a635ef36d9d1ddf532db3fde4407a3a65b5bcfa3c4ad7b5e52174214554b"} Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.385402 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" event={"ID":"59ee7394-b396-4e50-9ead-e46010aa9f40","Type":"ContainerStarted","Data":"ffe2807ff6225235c2528817ac5ce16a403d6a61fdbcc553a4e61ba0a220ca9d"} Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.404757 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" 
podStartSLOduration=1.404735623 podStartE2EDuration="1.404735623s" podCreationTimestamp="2025-09-30 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:15:01.398675665 +0000 UTC m=+1005.381764104" watchObservedRunningTime="2025-09-30 14:15:01.404735623 +0000 UTC m=+1005.387824052" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.839411 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cfjfj"] Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.842963 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.846206 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.846477 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.846534 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.846536 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bw7jg" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.863481 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cfjfj"] Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.929858 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xn6j9"] Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.931782 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.932013 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpbh\" (UniqueName: \"kubernetes.io/projected/9aeb973a-5d23-461b-a9b8-4f816dc01830-kube-api-access-8qpbh\") pod \"dnsmasq-dns-675f4bcbfc-cfjfj\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.932608 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb973a-5d23-461b-a9b8-4f816dc01830-config\") pod \"dnsmasq-dns-675f4bcbfc-cfjfj\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.934432 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 14:15:01 crc kubenswrapper[4676]: I0930 14:15:01.979022 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xn6j9"] Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.034762 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.034840 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tdx9\" (UniqueName: \"kubernetes.io/projected/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-kube-api-access-9tdx9\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.034909 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-config\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.034978 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpbh\" (UniqueName: \"kubernetes.io/projected/9aeb973a-5d23-461b-a9b8-4f816dc01830-kube-api-access-8qpbh\") pod \"dnsmasq-dns-675f4bcbfc-cfjfj\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.035005 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb973a-5d23-461b-a9b8-4f816dc01830-config\") pod \"dnsmasq-dns-675f4bcbfc-cfjfj\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.036076 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb973a-5d23-461b-a9b8-4f816dc01830-config\") pod \"dnsmasq-dns-675f4bcbfc-cfjfj\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.063827 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpbh\" (UniqueName: \"kubernetes.io/projected/9aeb973a-5d23-461b-a9b8-4f816dc01830-kube-api-access-8qpbh\") pod \"dnsmasq-dns-675f4bcbfc-cfjfj\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:02 crc 
kubenswrapper[4676]: I0930 14:15:02.137050 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.137134 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tdx9\" (UniqueName: \"kubernetes.io/projected/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-kube-api-access-9tdx9\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.137171 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-config\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.138509 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.138783 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-config\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.158679 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9tdx9\" (UniqueName: \"kubernetes.io/projected/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-kube-api-access-9tdx9\") pod \"dnsmasq-dns-78dd6ddcc-xn6j9\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.165460 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.265230 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.401650 4676 generic.go:334] "Generic (PLEG): container finished" podID="59ee7394-b396-4e50-9ead-e46010aa9f40" containerID="4194a635ef36d9d1ddf532db3fde4407a3a65b5bcfa3c4ad7b5e52174214554b" exitCode=0 Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.401717 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" event={"ID":"59ee7394-b396-4e50-9ead-e46010aa9f40","Type":"ContainerDied","Data":"4194a635ef36d9d1ddf532db3fde4407a3a65b5bcfa3c4ad7b5e52174214554b"} Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.444485 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cfjfj"] Sep 30 14:15:02 crc kubenswrapper[4676]: I0930 14:15:02.734713 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xn6j9"] Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.412251 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" event={"ID":"2ff371d8-e7bd-48aa-806e-a91a8a5a629c","Type":"ContainerStarted","Data":"2db89d0cdb69b4fb500289f43f112985609899f29ef747564a50a4800ce6ce5d"} Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.414275 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" event={"ID":"9aeb973a-5d23-461b-a9b8-4f816dc01830","Type":"ContainerStarted","Data":"93ea78b03a318b20b17d38864ea0c0de0164accdb1bd6704277c1b2cebbe611f"} Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.715567 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.873539 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ee7394-b396-4e50-9ead-e46010aa9f40-config-volume\") pod \"59ee7394-b396-4e50-9ead-e46010aa9f40\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.873615 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59ee7394-b396-4e50-9ead-e46010aa9f40-secret-volume\") pod \"59ee7394-b396-4e50-9ead-e46010aa9f40\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.873745 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzv8r\" (UniqueName: \"kubernetes.io/projected/59ee7394-b396-4e50-9ead-e46010aa9f40-kube-api-access-mzv8r\") pod \"59ee7394-b396-4e50-9ead-e46010aa9f40\" (UID: \"59ee7394-b396-4e50-9ead-e46010aa9f40\") " Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.878489 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ee7394-b396-4e50-9ead-e46010aa9f40-config-volume" (OuterVolumeSpecName: "config-volume") pod "59ee7394-b396-4e50-9ead-e46010aa9f40" (UID: "59ee7394-b396-4e50-9ead-e46010aa9f40"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.881366 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ee7394-b396-4e50-9ead-e46010aa9f40-kube-api-access-mzv8r" (OuterVolumeSpecName: "kube-api-access-mzv8r") pod "59ee7394-b396-4e50-9ead-e46010aa9f40" (UID: "59ee7394-b396-4e50-9ead-e46010aa9f40"). InnerVolumeSpecName "kube-api-access-mzv8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.896457 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ee7394-b396-4e50-9ead-e46010aa9f40-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "59ee7394-b396-4e50-9ead-e46010aa9f40" (UID: "59ee7394-b396-4e50-9ead-e46010aa9f40"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.975775 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59ee7394-b396-4e50-9ead-e46010aa9f40-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.975824 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59ee7394-b396-4e50-9ead-e46010aa9f40-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:03 crc kubenswrapper[4676]: I0930 14:15:03.975841 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzv8r\" (UniqueName: \"kubernetes.io/projected/59ee7394-b396-4e50-9ead-e46010aa9f40-kube-api-access-mzv8r\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:04 crc kubenswrapper[4676]: I0930 14:15:04.433316 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" 
event={"ID":"59ee7394-b396-4e50-9ead-e46010aa9f40","Type":"ContainerDied","Data":"ffe2807ff6225235c2528817ac5ce16a403d6a61fdbcc553a4e61ba0a220ca9d"} Sep 30 14:15:04 crc kubenswrapper[4676]: I0930 14:15:04.433441 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f" Sep 30 14:15:04 crc kubenswrapper[4676]: I0930 14:15:04.433464 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffe2807ff6225235c2528817ac5ce16a403d6a61fdbcc553a4e61ba0a220ca9d" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.018143 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cfjfj"] Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.037737 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kp5vg"] Sep 30 14:15:05 crc kubenswrapper[4676]: E0930 14:15:05.038318 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ee7394-b396-4e50-9ead-e46010aa9f40" containerName="collect-profiles" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.038345 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ee7394-b396-4e50-9ead-e46010aa9f40" containerName="collect-profiles" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.038625 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ee7394-b396-4e50-9ead-e46010aa9f40" containerName="collect-profiles" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.039689 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.061980 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kp5vg"] Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.209226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.209325 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ffw8\" (UniqueName: \"kubernetes.io/projected/b6fe7625-e1a3-4abd-b42b-591d546d5e37-kube-api-access-6ffw8\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.209383 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-config\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.310717 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.310784 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ffw8\" (UniqueName: 
\"kubernetes.io/projected/b6fe7625-e1a3-4abd-b42b-591d546d5e37-kube-api-access-6ffw8\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.310825 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-config\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.311740 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-config\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.312317 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.320185 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xn6j9"] Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.346359 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ffw8\" (UniqueName: \"kubernetes.io/projected/b6fe7625-e1a3-4abd-b42b-591d546d5e37-kube-api-access-6ffw8\") pod \"dnsmasq-dns-5ccc8479f9-kp5vg\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.355754 4676 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-txc97"] Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.357170 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.367636 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-txc97"] Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.380834 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.513827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgsrf\" (UniqueName: \"kubernetes.io/projected/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-kube-api-access-cgsrf\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.513954 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-config\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.514007 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.616179 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgsrf\" (UniqueName: 
\"kubernetes.io/projected/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-kube-api-access-cgsrf\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.616488 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-config\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.616519 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.617322 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.617394 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-config\") pod \"dnsmasq-dns-57d769cc4f-txc97\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.642073 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgsrf\" (UniqueName: \"kubernetes.io/projected/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-kube-api-access-cgsrf\") pod \"dnsmasq-dns-57d769cc4f-txc97\" 
(UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.699861 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.961675 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-txc97"] Sep 30 14:15:05 crc kubenswrapper[4676]: W0930 14:15:05.974415 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fb231af_b3b4_45d8_ae43_8b418e1dfcf5.slice/crio-a1ea1d8b19b4205385f691ed1ec9ef5f597a353afa2cefd1e626271cb867597a WatchSource:0}: Error finding container a1ea1d8b19b4205385f691ed1ec9ef5f597a353afa2cefd1e626271cb867597a: Status 404 returned error can't find the container with id a1ea1d8b19b4205385f691ed1ec9ef5f597a353afa2cefd1e626271cb867597a Sep 30 14:15:05 crc kubenswrapper[4676]: I0930 14:15:05.999333 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kp5vg"] Sep 30 14:15:06 crc kubenswrapper[4676]: W0930 14:15:06.002824 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6fe7625_e1a3_4abd_b42b_591d546d5e37.slice/crio-478dc7fff95b07ff7747b232da99f30605451d6bbba400f9de97bfe47c28f506 WatchSource:0}: Error finding container 478dc7fff95b07ff7747b232da99f30605451d6bbba400f9de97bfe47c28f506: Status 404 returned error can't find the container with id 478dc7fff95b07ff7747b232da99f30605451d6bbba400f9de97bfe47c28f506 Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.198663 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.200620 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.203424 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.204805 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.205304 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.204984 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.205181 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.205236 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mwbdr" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.205252 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.206166 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.336960 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.337226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65fc51e6-2db0-4efd-880c-0ba599ef8637-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.337606 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.337684 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz85l\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-kube-api-access-xz85l\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.337821 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.337968 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65fc51e6-2db0-4efd-880c-0ba599ef8637-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.338034 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.338115 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.338238 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.338419 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.338484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.440693 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.440758 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz85l\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-kube-api-access-xz85l\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.440804 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.440851 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65fc51e6-2db0-4efd-880c-0ba599ef8637-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.440948 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.441067 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.441099 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.441194 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.441224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.441269 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.441487 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65fc51e6-2db0-4efd-880c-0ba599ef8637-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 
14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.441946 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.442326 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.442369 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.442704 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.443028 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.443087 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.449756 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65fc51e6-2db0-4efd-880c-0ba599ef8637-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.452091 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.452500 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65fc51e6-2db0-4efd-880c-0ba599ef8637-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.456651 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.457556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" 
event={"ID":"b6fe7625-e1a3-4abd-b42b-591d546d5e37","Type":"ContainerStarted","Data":"478dc7fff95b07ff7747b232da99f30605451d6bbba400f9de97bfe47c28f506"} Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.459262 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" event={"ID":"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5","Type":"ContainerStarted","Data":"a1ea1d8b19b4205385f691ed1ec9ef5f597a353afa2cefd1e626271cb867597a"} Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.463440 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz85l\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-kube-api-access-xz85l\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.475260 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.492731 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.494605 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.496379 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.497294 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.497429 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.497621 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.497813 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.497973 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lps8l" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.502813 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.510684 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.545619 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.644732 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.644849 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645255 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645307 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645359 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee88379-95a2-4019-a41a-7931a5ab2f30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc 
kubenswrapper[4676]: I0930 14:15:06.645405 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee88379-95a2-4019-a41a-7931a5ab2f30-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645431 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645473 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9fc6\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-kube-api-access-q9fc6\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645510 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645539 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.645568 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747253 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee88379-95a2-4019-a41a-7931a5ab2f30-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747337 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9fc6\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-kube-api-access-q9fc6\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747364 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747382 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747401 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747442 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747457 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747479 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747523 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 
14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.747553 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee88379-95a2-4019-a41a-7931a5ab2f30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.749708 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-config-data\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.750855 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.752237 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.752746 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.753303 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.753779 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.755690 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee88379-95a2-4019-a41a-7931a5ab2f30-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.758392 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.760176 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee88379-95a2-4019-a41a-7931a5ab2f30-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.769837 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 
crc kubenswrapper[4676]: I0930 14:15:06.775727 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9fc6\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-kube-api-access-q9fc6\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.792467 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " pod="openstack/rabbitmq-server-0" Sep 30 14:15:06 crc kubenswrapper[4676]: I0930 14:15:06.838759 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:15:07 crc kubenswrapper[4676]: I0930 14:15:07.074842 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.741962 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.744283 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.752830 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.754141 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.754344 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.754672 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.754736 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.755355 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-w6pps" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.759625 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.884499 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.884574 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 
30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.884615 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.884663 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.884706 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.884731 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-secrets\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.885144 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d32e1e85-6a70-4751-9223-85e7018c3cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.885249 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6wll\" (UniqueName: \"kubernetes.io/projected/d32e1e85-6a70-4751-9223-85e7018c3cc7-kube-api-access-m6wll\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.885304 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.988979 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d32e1e85-6a70-4751-9223-85e7018c3cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989055 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6wll\" (UniqueName: \"kubernetes.io/projected/d32e1e85-6a70-4751-9223-85e7018c3cc7-kube-api-access-m6wll\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989083 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989108 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989146 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989180 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989386 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989444 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989470 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-secrets\") pod \"openstack-galera-0\" (UID: 
\"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.989692 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d32e1e85-6a70-4751-9223-85e7018c3cc7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.990624 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.991292 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-kolla-config\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.992485 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-config-data-default\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.993324 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d32e1e85-6a70-4751-9223-85e7018c3cc7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:08 crc kubenswrapper[4676]: I0930 14:15:08.997731 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-secrets\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:08.998525 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.007821 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32e1e85-6a70-4751-9223-85e7018c3cc7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.010358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wll\" (UniqueName: \"kubernetes.io/projected/d32e1e85-6a70-4751-9223-85e7018c3cc7-kube-api-access-m6wll\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.019901 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d32e1e85-6a70-4751-9223-85e7018c3cc7\") " pod="openstack/openstack-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.093259 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.194486 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.197586 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.200513 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-r84zs" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.205848 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.207286 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.207623 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.207766 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.294797 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9762912c-8ca3-4791-93c0-4d5728543998-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9xj\" (UniqueName: \"kubernetes.io/projected/9762912c-8ca3-4791-93c0-4d5728543998-kube-api-access-vn9xj\") pod \"openstack-cell1-galera-0\" 
(UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295573 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295642 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295673 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295771 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295795 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.295862 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397533 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397572 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397596 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397625 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397642 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9762912c-8ca3-4791-93c0-4d5728543998-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397658 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9xj\" (UniqueName: \"kubernetes.io/projected/9762912c-8ca3-4791-93c0-4d5728543998-kube-api-access-vn9xj\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397689 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.397725 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 
14:15:09.397742 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.399278 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9762912c-8ca3-4791-93c0-4d5728543998-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.400255 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.401180 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.401680 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.402152 4676 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9762912c-8ca3-4791-93c0-4d5728543998-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.404953 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.405101 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.435449 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9xj\" (UniqueName: \"kubernetes.io/projected/9762912c-8ca3-4791-93c0-4d5728543998-kube-api-access-vn9xj\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.445958 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9762912c-8ca3-4791-93c0-4d5728543998-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.455577 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"9762912c-8ca3-4791-93c0-4d5728543998\") " pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.533916 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.586426 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.587727 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.594054 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.594302 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dcqlq" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.595362 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.603520 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.704853 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aef6349-6b33-4f9e-972d-a990cb3ff62e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.704918 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9984\" (UniqueName: \"kubernetes.io/projected/7aef6349-6b33-4f9e-972d-a990cb3ff62e-kube-api-access-t9984\") pod \"memcached-0\" (UID: 
\"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.704972 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7aef6349-6b33-4f9e-972d-a990cb3ff62e-kolla-config\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.705068 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7aef6349-6b33-4f9e-972d-a990cb3ff62e-config-data\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.705097 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aef6349-6b33-4f9e-972d-a990cb3ff62e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.806983 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aef6349-6b33-4f9e-972d-a990cb3ff62e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.807043 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9984\" (UniqueName: \"kubernetes.io/projected/7aef6349-6b33-4f9e-972d-a990cb3ff62e-kube-api-access-t9984\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.807096 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7aef6349-6b33-4f9e-972d-a990cb3ff62e-kolla-config\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.807196 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7aef6349-6b33-4f9e-972d-a990cb3ff62e-config-data\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.807227 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aef6349-6b33-4f9e-972d-a990cb3ff62e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.808358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7aef6349-6b33-4f9e-972d-a990cb3ff62e-kolla-config\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.808444 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7aef6349-6b33-4f9e-972d-a990cb3ff62e-config-data\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.812178 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aef6349-6b33-4f9e-972d-a990cb3ff62e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") 
" pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.812590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aef6349-6b33-4f9e-972d-a990cb3ff62e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.830447 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9984\" (UniqueName: \"kubernetes.io/projected/7aef6349-6b33-4f9e-972d-a990cb3ff62e-kube-api-access-t9984\") pod \"memcached-0\" (UID: \"7aef6349-6b33-4f9e-972d-a990cb3ff62e\") " pod="openstack/memcached-0" Sep 30 14:15:09 crc kubenswrapper[4676]: I0930 14:15:09.960125 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.264311 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.265545 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.267528 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9fks2" Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.277821 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.332654 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-785br\" (UniqueName: \"kubernetes.io/projected/ad8d4649-f28a-4d12-884f-44308450c02b-kube-api-access-785br\") pod \"kube-state-metrics-0\" (UID: \"ad8d4649-f28a-4d12-884f-44308450c02b\") " pod="openstack/kube-state-metrics-0" Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.435565 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-785br\" (UniqueName: \"kubernetes.io/projected/ad8d4649-f28a-4d12-884f-44308450c02b-kube-api-access-785br\") pod \"kube-state-metrics-0\" (UID: \"ad8d4649-f28a-4d12-884f-44308450c02b\") " pod="openstack/kube-state-metrics-0" Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.461994 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-785br\" (UniqueName: \"kubernetes.io/projected/ad8d4649-f28a-4d12-884f-44308450c02b-kube-api-access-785br\") pod \"kube-state-metrics-0\" (UID: \"ad8d4649-f28a-4d12-884f-44308450c02b\") " pod="openstack/kube-state-metrics-0" Sep 30 14:15:11 crc kubenswrapper[4676]: I0930 14:15:11.589088 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 14:15:13 crc kubenswrapper[4676]: I0930 14:15:13.539698 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65fc51e6-2db0-4efd-880c-0ba599ef8637","Type":"ContainerStarted","Data":"e56d13fc7d0baeaa2fcbfcecd2480ee9e2cfc3d4f710aabec23ca62f8990a525"} Sep 30 14:15:13 crc kubenswrapper[4676]: I0930 14:15:13.718107 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.973747 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9nlxb"] Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.981302 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.984526 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.984940 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hwv6p" Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.988534 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7t792"] Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.990804 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.991142 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:14 crc kubenswrapper[4676]: I0930 14:15:14.997389 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9nlxb"] Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.025044 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7t792"] Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096501 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-run-ovn\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096554 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-run\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096583 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknlz\" (UniqueName: \"kubernetes.io/projected/72431bf0-bd8f-431d-81a8-082f9ef654e1-kube-api-access-cknlz\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096603 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-etc-ovs\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 
14:15:15.096621 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72431bf0-bd8f-431d-81a8-082f9ef654e1-combined-ca-bundle\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096641 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwl7g\" (UniqueName: \"kubernetes.io/projected/a69f5429-b9b9-47c3-b720-1a59c5d87b27-kube-api-access-qwl7g\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096669 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-run\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096703 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-log\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096769 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/72431bf0-bd8f-431d-81a8-082f9ef654e1-ovn-controller-tls-certs\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096822 
4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72431bf0-bd8f-431d-81a8-082f9ef654e1-scripts\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-log-ovn\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096910 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-lib\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.096945 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69f5429-b9b9-47c3-b720-1a59c5d87b27-scripts\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.198793 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-log\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.198894 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/72431bf0-bd8f-431d-81a8-082f9ef654e1-ovn-controller-tls-certs\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.198930 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72431bf0-bd8f-431d-81a8-082f9ef654e1-scripts\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.198966 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-log-ovn\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199012 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-lib\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199062 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69f5429-b9b9-47c3-b720-1a59c5d87b27-scripts\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199094 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-run-ovn\") pod \"ovn-controller-9nlxb\" (UID: 
\"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199120 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-run\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199186 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cknlz\" (UniqueName: \"kubernetes.io/projected/72431bf0-bd8f-431d-81a8-082f9ef654e1-kube-api-access-cknlz\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199211 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-etc-ovs\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199241 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72431bf0-bd8f-431d-81a8-082f9ef654e1-combined-ca-bundle\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199264 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwl7g\" (UniqueName: \"kubernetes.io/projected/a69f5429-b9b9-47c3-b720-1a59c5d87b27-kube-api-access-qwl7g\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc 
kubenswrapper[4676]: I0930 14:15:15.199303 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-run\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199613 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-log\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199690 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-run\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-run\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.199771 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-run-ovn\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.200544 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-var-lib\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.200676 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72431bf0-bd8f-431d-81a8-082f9ef654e1-var-log-ovn\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.202603 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72431bf0-bd8f-431d-81a8-082f9ef654e1-scripts\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.202785 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a69f5429-b9b9-47c3-b720-1a59c5d87b27-etc-ovs\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.203590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69f5429-b9b9-47c3-b720-1a59c5d87b27-scripts\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.205947 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/72431bf0-bd8f-431d-81a8-082f9ef654e1-ovn-controller-tls-certs\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " 
pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.206312 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72431bf0-bd8f-431d-81a8-082f9ef654e1-combined-ca-bundle\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.217117 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwl7g\" (UniqueName: \"kubernetes.io/projected/a69f5429-b9b9-47c3-b720-1a59c5d87b27-kube-api-access-qwl7g\") pod \"ovn-controller-ovs-7t792\" (UID: \"a69f5429-b9b9-47c3-b720-1a59c5d87b27\") " pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.221618 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknlz\" (UniqueName: \"kubernetes.io/projected/72431bf0-bd8f-431d-81a8-082f9ef654e1-kube-api-access-cknlz\") pod \"ovn-controller-9nlxb\" (UID: \"72431bf0-bd8f-431d-81a8-082f9ef654e1\") " pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.324064 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9nlxb" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.335865 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.839377 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.841190 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.846287 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.846697 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.846850 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-85zbq" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.847063 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.847206 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.876638 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.910379 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.910444 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trc67\" (UniqueName: \"kubernetes.io/projected/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-kube-api-access-trc67\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.910587 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.910678 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.910706 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.910838 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.910982 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:15 crc kubenswrapper[4676]: I0930 14:15:15.911043 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-config\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012127 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012195 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-config\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012262 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012286 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trc67\" (UniqueName: \"kubernetes.io/projected/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-kube-api-access-trc67\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " 
pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012309 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012348 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012364 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.012549 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.013031 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.013209 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-config\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.013817 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.018177 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.018956 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.024499 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.031467 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trc67\" (UniqueName: \"kubernetes.io/projected/35e313d1-3779-4eb1-b12f-c3b5432dfd1d-kube-api-access-trc67\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " 
pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.032653 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"35e313d1-3779-4eb1-b12f-c3b5432dfd1d\") " pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:16 crc kubenswrapper[4676]: I0930 14:15:16.172451 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.021797 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.024078 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.027638 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-czhqz" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.028005 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.028165 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.028346 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.054568 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059577 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059638 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059679 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059737 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059766 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " 
pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059789 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvnch\" (UniqueName: \"kubernetes.io/projected/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-kube-api-access-dvnch\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.059818 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-config\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163044 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163094 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163128 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163161 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163179 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163209 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163275 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvnch\" (UniqueName: \"kubernetes.io/projected/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-kube-api-access-dvnch\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163322 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-config\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.163453 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.164372 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.165242 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.169654 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-config\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.170502 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.170611 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.171634 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.180638 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvnch\" (UniqueName: \"kubernetes.io/projected/e73b3580-31d5-4c06-9bd8-acbd16c5c48d-kube-api-access-dvnch\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.184894 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e73b3580-31d5-4c06-9bd8-acbd16c5c48d\") " pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.400862 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 14:15:19 crc kubenswrapper[4676]: I0930 14:15:19.582693 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ee88379-95a2-4019-a41a-7931a5ab2f30","Type":"ContainerStarted","Data":"fcea88a4ef537edc4f1a5808b5837900199b338d0357a20388b5e52d683a133e"} Sep 30 14:15:23 crc kubenswrapper[4676]: I0930 14:15:23.712846 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 14:15:23 crc kubenswrapper[4676]: I0930 14:15:23.758899 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.125552 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.125922 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ffw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-kp5vg_openstack(b6fe7625-e1a3-4abd-b42b-591d546d5e37): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.127022 4676 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.127007 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" podUID="b6fe7625-e1a3-4abd-b42b-591d546d5e37" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.127101 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tdx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-xn6j9_openstack(2ff371d8-e7bd-48aa-806e-a91a8a5a629c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.128208 4676 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" podUID="2ff371d8-e7bd-48aa-806e-a91a8a5a629c" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.140470 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.140632 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cgsrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-txc97_openstack(1fb231af-b3b4-45d8-ae43-8b418e1dfcf5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.141823 4676 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" podUID="1fb231af-b3b4-45d8-ae43-8b418e1dfcf5" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.150733 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.150964 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qpbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-cfjfj_openstack(9aeb973a-5d23-461b-a9b8-4f816dc01830): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.152274 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" podUID="9aeb973a-5d23-461b-a9b8-4f816dc01830" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.618927 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" podUID="b6fe7625-e1a3-4abd-b42b-591d546d5e37" Sep 30 14:15:24 crc kubenswrapper[4676]: E0930 14:15:24.619536 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" podUID="1fb231af-b3b4-45d8-ae43-8b418e1dfcf5" Sep 30 14:15:26 crc kubenswrapper[4676]: W0930 14:15:26.607671 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aef6349_6b33_4f9e_972d_a990cb3ff62e.slice/crio-b13981ff73a16489ecc1b77afbce6f1fa3a62fbfab7478e22d0d9440a0d6c933 WatchSource:0}: Error finding container b13981ff73a16489ecc1b77afbce6f1fa3a62fbfab7478e22d0d9440a0d6c933: Status 404 returned error can't find the container with id b13981ff73a16489ecc1b77afbce6f1fa3a62fbfab7478e22d0d9440a0d6c933 Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.632151 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7aef6349-6b33-4f9e-972d-a990cb3ff62e","Type":"ContainerStarted","Data":"b13981ff73a16489ecc1b77afbce6f1fa3a62fbfab7478e22d0d9440a0d6c933"} Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.633519 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"ad8d4649-f28a-4d12-884f-44308450c02b","Type":"ContainerStarted","Data":"489c0e0f1d022156a69c64aa6de6015d64a148f632f31bbe91c64858f5e199d0"} Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.635033 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" event={"ID":"9aeb973a-5d23-461b-a9b8-4f816dc01830","Type":"ContainerDied","Data":"93ea78b03a318b20b17d38864ea0c0de0164accdb1bd6704277c1b2cebbe611f"} Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.635065 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93ea78b03a318b20b17d38864ea0c0de0164accdb1bd6704277c1b2cebbe611f" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.756949 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.775089 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.815905 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qpbh\" (UniqueName: \"kubernetes.io/projected/9aeb973a-5d23-461b-a9b8-4f816dc01830-kube-api-access-8qpbh\") pod \"9aeb973a-5d23-461b-a9b8-4f816dc01830\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.815957 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tdx9\" (UniqueName: \"kubernetes.io/projected/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-kube-api-access-9tdx9\") pod \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.816012 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-config\") pod \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.816056 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-dns-svc\") pod \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\" (UID: \"2ff371d8-e7bd-48aa-806e-a91a8a5a629c\") " Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.816095 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb973a-5d23-461b-a9b8-4f816dc01830-config\") pod \"9aeb973a-5d23-461b-a9b8-4f816dc01830\" (UID: \"9aeb973a-5d23-461b-a9b8-4f816dc01830\") " Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.816646 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ff371d8-e7bd-48aa-806e-a91a8a5a629c" (UID: "2ff371d8-e7bd-48aa-806e-a91a8a5a629c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.816703 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aeb973a-5d23-461b-a9b8-4f816dc01830-config" (OuterVolumeSpecName: "config") pod "9aeb973a-5d23-461b-a9b8-4f816dc01830" (UID: "9aeb973a-5d23-461b-a9b8-4f816dc01830"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.817607 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-config" (OuterVolumeSpecName: "config") pod "2ff371d8-e7bd-48aa-806e-a91a8a5a629c" (UID: "2ff371d8-e7bd-48aa-806e-a91a8a5a629c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.826224 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-kube-api-access-9tdx9" (OuterVolumeSpecName: "kube-api-access-9tdx9") pod "2ff371d8-e7bd-48aa-806e-a91a8a5a629c" (UID: "2ff371d8-e7bd-48aa-806e-a91a8a5a629c"). InnerVolumeSpecName "kube-api-access-9tdx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.837792 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aeb973a-5d23-461b-a9b8-4f816dc01830-kube-api-access-8qpbh" (OuterVolumeSpecName: "kube-api-access-8qpbh") pod "9aeb973a-5d23-461b-a9b8-4f816dc01830" (UID: "9aeb973a-5d23-461b-a9b8-4f816dc01830"). InnerVolumeSpecName "kube-api-access-8qpbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.917395 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tdx9\" (UniqueName: \"kubernetes.io/projected/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-kube-api-access-9tdx9\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.917428 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.917439 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff371d8-e7bd-48aa-806e-a91a8a5a629c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.917449 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb973a-5d23-461b-a9b8-4f816dc01830-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:26 crc kubenswrapper[4676]: I0930 14:15:26.917460 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qpbh\" (UniqueName: \"kubernetes.io/projected/9aeb973a-5d23-461b-a9b8-4f816dc01830-kube-api-access-8qpbh\") on node \"crc\" DevicePath \"\"" Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.123624 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.180708 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9nlxb"] Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.188981 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.284228 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] 
Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.378334 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7t792"] Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.643103 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cfjfj" Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.643166 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.643168 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xn6j9" event={"ID":"2ff371d8-e7bd-48aa-806e-a91a8a5a629c","Type":"ContainerDied","Data":"2db89d0cdb69b4fb500289f43f112985609899f29ef747564a50a4800ce6ce5d"} Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.695219 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xn6j9"] Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.701966 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xn6j9"] Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.732193 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cfjfj"] Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.738595 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cfjfj"] Sep 30 14:15:27 crc kubenswrapper[4676]: W0930 14:15:27.799056 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9762912c_8ca3_4791_93c0_4d5728543998.slice/crio-fe683e30070838f302f50e69f0dd44903b4ac0e43e0eb8f06317149aa7e9fdfa WatchSource:0}: Error finding container fe683e30070838f302f50e69f0dd44903b4ac0e43e0eb8f06317149aa7e9fdfa: Status 404 returned error can't find the container with id 
fe683e30070838f302f50e69f0dd44903b4ac0e43e0eb8f06317149aa7e9fdfa Sep 30 14:15:27 crc kubenswrapper[4676]: I0930 14:15:27.984560 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 14:15:27 crc kubenswrapper[4676]: W0930 14:15:27.989631 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode73b3580_31d5_4c06_9bd8_acbd16c5c48d.slice/crio-bd89b0d9cbf8b91441c7843db4bfa5dd3b5518cda48a86f7d758c4ef0c6953b2 WatchSource:0}: Error finding container bd89b0d9cbf8b91441c7843db4bfa5dd3b5518cda48a86f7d758c4ef0c6953b2: Status 404 returned error can't find the container with id bd89b0d9cbf8b91441c7843db4bfa5dd3b5518cda48a86f7d758c4ef0c6953b2 Sep 30 14:15:28 crc kubenswrapper[4676]: I0930 14:15:28.652333 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9nlxb" event={"ID":"72431bf0-bd8f-431d-81a8-082f9ef654e1","Type":"ContainerStarted","Data":"ec00a791907c81153396c3e10a0d5ca565a04fd2b45a0af6822bcbfbdc24e455"} Sep 30 14:15:28 crc kubenswrapper[4676]: I0930 14:15:28.654962 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e73b3580-31d5-4c06-9bd8-acbd16c5c48d","Type":"ContainerStarted","Data":"bd89b0d9cbf8b91441c7843db4bfa5dd3b5518cda48a86f7d758c4ef0c6953b2"} Sep 30 14:15:28 crc kubenswrapper[4676]: I0930 14:15:28.656819 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9762912c-8ca3-4791-93c0-4d5728543998","Type":"ContainerStarted","Data":"fe683e30070838f302f50e69f0dd44903b4ac0e43e0eb8f06317149aa7e9fdfa"} Sep 30 14:15:28 crc kubenswrapper[4676]: I0930 14:15:28.658263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7t792" event={"ID":"a69f5429-b9b9-47c3-b720-1a59c5d87b27","Type":"ContainerStarted","Data":"91ec6cf22bde987ead2920e44066afef491bd5e56626321224b3764d4b0062a3"} Sep 30 
14:15:28 crc kubenswrapper[4676]: I0930 14:15:28.660943 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"35e313d1-3779-4eb1-b12f-c3b5432dfd1d","Type":"ContainerStarted","Data":"e81c566639f33c0b597910425ef9dcda4ccfde1073311cb0b464e70824b73dcc"} Sep 30 14:15:28 crc kubenswrapper[4676]: I0930 14:15:28.661847 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d32e1e85-6a70-4751-9223-85e7018c3cc7","Type":"ContainerStarted","Data":"4eabef7b7204bb072f5be5c06811eb018161b808f44ef7ecccc685b7eb5d5918"} Sep 30 14:15:29 crc kubenswrapper[4676]: I0930 14:15:29.443718 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff371d8-e7bd-48aa-806e-a91a8a5a629c" path="/var/lib/kubelet/pods/2ff371d8-e7bd-48aa-806e-a91a8a5a629c/volumes" Sep 30 14:15:29 crc kubenswrapper[4676]: I0930 14:15:29.444463 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aeb973a-5d23-461b-a9b8-4f816dc01830" path="/var/lib/kubelet/pods/9aeb973a-5d23-461b-a9b8-4f816dc01830/volumes" Sep 30 14:15:29 crc kubenswrapper[4676]: I0930 14:15:29.920188 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:15:29 crc kubenswrapper[4676]: I0930 14:15:29.920260 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:15:30 crc kubenswrapper[4676]: I0930 14:15:30.676818 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"0ee88379-95a2-4019-a41a-7931a5ab2f30","Type":"ContainerStarted","Data":"6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522"} Sep 30 14:15:30 crc kubenswrapper[4676]: I0930 14:15:30.678989 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65fc51e6-2db0-4efd-880c-0ba599ef8637","Type":"ContainerStarted","Data":"00862cfe13d270a978c29c8f3cf52989a207952fa61191fd0dfcc057ec6241f6"} Sep 30 14:15:55 crc kubenswrapper[4676]: E0930 14:15:55.073457 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Sep 30 14:15:55 crc kubenswrapper[4676]: E0930 14:15:55.074366 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd4h7chd4h544h96h56dh54chfbh5h665h64h55bhchc9hb9h664h5d6h685h578h75hd5h7hd5h64fhc9h66ch57ch57fh77h5cbh574h687q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extr
acted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trc67,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(35e313d1-3779-4eb1-b12f-c3b5432dfd1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:15:55 crc kubenswrapper[4676]: E0930 14:15:55.079451 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Sep 30 14:15:55 crc kubenswrapper[4676]: E0930 14:15:55.079584 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn9xj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(9762912c-8ca3-4791-93c0-4d5728543998): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:15:55 crc kubenswrapper[4676]: E0930 14:15:55.080780 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="9762912c-8ca3-4791-93c0-4d5728543998" Sep 30 14:15:59 crc kubenswrapper[4676]: I0930 14:15:59.919002 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:15:59 crc kubenswrapper[4676]: I0930 14:15:59.919547 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:15:59 crc kubenswrapper[4676]: I0930 14:15:59.919593 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:15:59 crc kubenswrapper[4676]: I0930 14:15:59.920322 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"124b2a96400d24919cd22f949ea67e3aa5eaa3e8b7e92aeb3ff38f0b94aac9fe"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:15:59 crc kubenswrapper[4676]: I0930 14:15:59.920366 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://124b2a96400d24919cd22f949ea67e3aa5eaa3e8b7e92aeb3ff38f0b94aac9fe" gracePeriod=600 Sep 30 14:16:01 crc kubenswrapper[4676]: I0930 14:16:01.929257 4676 generic.go:334] "Generic (PLEG): container finished" podID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerID="6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522" exitCode=0 Sep 30 14:16:01 crc kubenswrapper[4676]: I0930 14:16:01.929355 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ee88379-95a2-4019-a41a-7931a5ab2f30","Type":"ContainerDied","Data":"6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522"} Sep 30 14:16:01 crc kubenswrapper[4676]: I0930 14:16:01.932911 4676 generic.go:334] "Generic (PLEG): container finished" podID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerID="00862cfe13d270a978c29c8f3cf52989a207952fa61191fd0dfcc057ec6241f6" exitCode=0 Sep 30 14:16:01 crc kubenswrapper[4676]: I0930 14:16:01.933008 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65fc51e6-2db0-4efd-880c-0ba599ef8637","Type":"ContainerDied","Data":"00862cfe13d270a978c29c8f3cf52989a207952fa61191fd0dfcc057ec6241f6"} Sep 30 14:16:07 crc kubenswrapper[4676]: I0930 14:16:07.980704 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" 
containerID="124b2a96400d24919cd22f949ea67e3aa5eaa3e8b7e92aeb3ff38f0b94aac9fe" exitCode=0 Sep 30 14:16:07 crc kubenswrapper[4676]: I0930 14:16:07.980801 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"124b2a96400d24919cd22f949ea67e3aa5eaa3e8b7e92aeb3ff38f0b94aac9fe"} Sep 30 14:16:07 crc kubenswrapper[4676]: I0930 14:16:07.981224 4676 scope.go:117] "RemoveContainer" containerID="8220f97a191b5b5caecda4ca09b3bb938693c25c347212483dfba93ae90b896c" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.399223 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-w8c76"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.402130 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.414311 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.421115 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w8c76"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.466582 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39632f21-878d-4bb9-ba72-afcac2cd0b5d-config\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.466655 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc88n\" (UniqueName: \"kubernetes.io/projected/39632f21-878d-4bb9-ba72-afcac2cd0b5d-kube-api-access-mc88n\") pod 
\"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.466694 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/39632f21-878d-4bb9-ba72-afcac2cd0b5d-ovs-rundir\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.466722 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/39632f21-878d-4bb9-ba72-afcac2cd0b5d-ovn-rundir\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.466758 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39632f21-878d-4bb9-ba72-afcac2cd0b5d-combined-ca-bundle\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.466791 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39632f21-878d-4bb9-ba72-afcac2cd0b5d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.567746 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kp5vg"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.568771 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39632f21-878d-4bb9-ba72-afcac2cd0b5d-config\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.568846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc88n\" (UniqueName: \"kubernetes.io/projected/39632f21-878d-4bb9-ba72-afcac2cd0b5d-kube-api-access-mc88n\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.569081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/39632f21-878d-4bb9-ba72-afcac2cd0b5d-ovs-rundir\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.569112 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/39632f21-878d-4bb9-ba72-afcac2cd0b5d-ovn-rundir\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.569168 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39632f21-878d-4bb9-ba72-afcac2cd0b5d-combined-ca-bundle\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.569203 4676 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39632f21-878d-4bb9-ba72-afcac2cd0b5d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.570299 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/39632f21-878d-4bb9-ba72-afcac2cd0b5d-ovs-rundir\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.570418 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/39632f21-878d-4bb9-ba72-afcac2cd0b5d-ovn-rundir\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.570997 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39632f21-878d-4bb9-ba72-afcac2cd0b5d-config\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.580942 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/39632f21-878d-4bb9-ba72-afcac2cd0b5d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.582681 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39632f21-878d-4bb9-ba72-afcac2cd0b5d-combined-ca-bundle\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.595109 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dmzbf"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.597641 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc88n\" (UniqueName: \"kubernetes.io/projected/39632f21-878d-4bb9-ba72-afcac2cd0b5d-kube-api-access-mc88n\") pod \"ovn-controller-metrics-w8c76\" (UID: \"39632f21-878d-4bb9-ba72-afcac2cd0b5d\") " pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.599868 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.604021 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dmzbf"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.604814 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.670319 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.670458 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-config\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: 
\"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.670537 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g66gs\" (UniqueName: \"kubernetes.io/projected/44390184-0cf3-4920-9d04-63311603ba59-kube-api-access-g66gs\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.670566 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.733713 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-w8c76" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.771987 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-config\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.772066 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g66gs\" (UniqueName: \"kubernetes.io/projected/44390184-0cf3-4920-9d04-63311603ba59-kube-api-access-g66gs\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.772090 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.772125 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.773092 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 
14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.773649 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-config\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.774427 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.807007 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g66gs\" (UniqueName: \"kubernetes.io/projected/44390184-0cf3-4920-9d04-63311603ba59-kube-api-access-g66gs\") pod \"dnsmasq-dns-7fd796d7df-dmzbf\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.860661 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-txc97"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.887897 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c4hz6"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.889436 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.891558 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.905329 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c4hz6"] Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.955500 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.974807 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnzrd\" (UniqueName: \"kubernetes.io/projected/17ee04bb-14fc-4d85-9293-fe465ddc167d-kube-api-access-lnzrd\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.975044 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.975160 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.975241 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-config\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:42 crc kubenswrapper[4676]: I0930 14:16:42.975309 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.077136 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnzrd\" (UniqueName: \"kubernetes.io/projected/17ee04bb-14fc-4d85-9293-fe465ddc167d-kube-api-access-lnzrd\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.077482 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.077645 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.077769 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-config\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.077877 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.078774 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.078867 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-config\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.078911 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.079324 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: 
\"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.100937 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnzrd\" (UniqueName: \"kubernetes.io/projected/17ee04bb-14fc-4d85-9293-fe465ddc167d-kube-api-access-lnzrd\") pod \"dnsmasq-dns-86db49b7ff-c4hz6\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") " pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:16:43 crc kubenswrapper[4676]: I0930 14:16:43.217940 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:17:02 crc kubenswrapper[4676]: E0930 14:17:02.817982 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Sep 30 14:17:02 crc kubenswrapper[4676]: E0930 14:17:02.818679 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n676h85h554h689h696h687hb9h67ch65dh697h5cbhf8h679h676h684h58fhfch8fh5d6hd7h565hcbh9fh547hdh9ch677hc9hb4h5dfh67fh54fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cknlz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:n
il,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-9nlxb_openstack(72431bf0-bd8f-431d-81a8-082f9ef654e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:17:02 crc kubenswrapper[4676]: E0930 14:17:02.819802 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-9nlxb" podUID="72431bf0-bd8f-431d-81a8-082f9ef654e1" Sep 30 14:17:03 crc kubenswrapper[4676]: E0930 14:17:03.622146 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-9nlxb" podUID="72431bf0-bd8f-431d-81a8-082f9ef654e1" Sep 30 14:17:04 crc kubenswrapper[4676]: E0930 14:17:04.512787 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 14:17:04 crc kubenswrapper[4676]: E0930 14:17:04.512997 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cgsrf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessPr
obe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-txc97_openstack(1fb231af-b3b4-45d8-ae43-8b418e1dfcf5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:17:04 crc kubenswrapper[4676]: E0930 14:17:04.515417 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" podUID="1fb231af-b3b4-45d8-ae43-8b418e1dfcf5" Sep 30 14:17:04 crc kubenswrapper[4676]: E0930 14:17:04.750765 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Sep 30 14:17:04 crc kubenswrapper[4676]: E0930 14:17:04.751031 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d7hfh587h587h55ch5c4h666hf4h99h64hf7h7fh666h5b7hcbh54ch694h5f4h59fh685h655h5b9hd4hf9h5c8h88hc5h54fh64fh648h697hbfq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dvnch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecActio
n{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(e73b3580-31d5-4c06-9bd8-acbd16c5c48d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:17:05 crc kubenswrapper[4676]: I0930 14:17:05.889358 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c4hz6"] Sep 30 14:17:05 crc kubenswrapper[4676]: I0930 14:17:05.902779 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7fd796d7df-dmzbf"] Sep 30 14:17:06 crc kubenswrapper[4676]: I0930 14:17:06.027980 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w8c76"] Sep 30 14:17:07 crc kubenswrapper[4676]: W0930 14:17:07.727811 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17ee04bb_14fc_4d85_9293_fe465ddc167d.slice/crio-8d5f6a5a29763fc7de515e0cac178c8a55dfd42f7fb13de94e71930622eb03b5 WatchSource:0}: Error finding container 8d5f6a5a29763fc7de515e0cac178c8a55dfd42f7fb13de94e71930622eb03b5: Status 404 returned error can't find the container with id 8d5f6a5a29763fc7de515e0cac178c8a55dfd42f7fb13de94e71930622eb03b5 Sep 30 14:17:08 crc kubenswrapper[4676]: W0930 14:17:08.145299 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44390184_0cf3_4920_9d04_63311603ba59.slice/crio-4ea46eb01b011bc39003f8ef8bd240a97642add49526aa27b7d1178289bf02ec WatchSource:0}: Error finding container 4ea46eb01b011bc39003f8ef8bd240a97642add49526aa27b7d1178289bf02ec: Status 404 returned error can't find the container with id 4ea46eb01b011bc39003f8ef8bd240a97642add49526aa27b7d1178289bf02ec Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.218978 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.277792 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-dns-svc\") pod \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.277966 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-config\") pod \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.278065 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgsrf\" (UniqueName: \"kubernetes.io/projected/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-kube-api-access-cgsrf\") pod \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\" (UID: \"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5\") " Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.279417 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fb231af-b3b4-45d8-ae43-8b418e1dfcf5" (UID: "1fb231af-b3b4-45d8-ae43-8b418e1dfcf5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.281129 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-config" (OuterVolumeSpecName: "config") pod "1fb231af-b3b4-45d8-ae43-8b418e1dfcf5" (UID: "1fb231af-b3b4-45d8-ae43-8b418e1dfcf5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.287263 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-kube-api-access-cgsrf" (OuterVolumeSpecName: "kube-api-access-cgsrf") pod "1fb231af-b3b4-45d8-ae43-8b418e1dfcf5" (UID: "1fb231af-b3b4-45d8-ae43-8b418e1dfcf5"). InnerVolumeSpecName "kube-api-access-cgsrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.380693 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.380736 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgsrf\" (UniqueName: \"kubernetes.io/projected/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-kube-api-access-cgsrf\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.380750 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.416179 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w8c76" event={"ID":"39632f21-878d-4bb9-ba72-afcac2cd0b5d","Type":"ContainerStarted","Data":"653d3b86ed80a91634426fca01c9e064c62e751dddd8bcba24c0ec3fe509aaa0"} Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.418827 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65fc51e6-2db0-4efd-880c-0ba599ef8637","Type":"ContainerStarted","Data":"bb59ac140098029e345d6da7202977df3632e37bef8af0d303536be23487701d"} Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.419026 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.420580 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" event={"ID":"44390184-0cf3-4920-9d04-63311603ba59","Type":"ContainerStarted","Data":"4ea46eb01b011bc39003f8ef8bd240a97642add49526aa27b7d1178289bf02ec"} Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.421596 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" event={"ID":"17ee04bb-14fc-4d85-9293-fe465ddc167d","Type":"ContainerStarted","Data":"8d5f6a5a29763fc7de515e0cac178c8a55dfd42f7fb13de94e71930622eb03b5"} Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.426708 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" event={"ID":"1fb231af-b3b4-45d8-ae43-8b418e1dfcf5","Type":"ContainerDied","Data":"a1ea1d8b19b4205385f691ed1ec9ef5f597a353afa2cefd1e626271cb867597a"} Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.426740 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-txc97" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.456359 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=106.242532351 podStartE2EDuration="2m3.456334178s" podCreationTimestamp="2025-09-30 14:15:05 +0000 UTC" firstStartedPulling="2025-09-30 14:15:12.753008888 +0000 UTC m=+1016.736097317" lastFinishedPulling="2025-09-30 14:15:29.966810715 +0000 UTC m=+1033.949899144" observedRunningTime="2025-09-30 14:17:08.448693831 +0000 UTC m=+1132.431782280" watchObservedRunningTime="2025-09-30 14:17:08.456334178 +0000 UTC m=+1132.439422607" Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.499696 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-txc97"] Sep 30 14:17:08 crc kubenswrapper[4676]: I0930 14:17:08.517062 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-txc97"] Sep 30 14:17:09 crc kubenswrapper[4676]: I0930 14:17:09.446984 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb231af-b3b4-45d8-ae43-8b418e1dfcf5" path="/var/lib/kubelet/pods/1fb231af-b3b4-45d8-ae43-8b418e1dfcf5/volumes" Sep 30 14:17:09 crc kubenswrapper[4676]: I0930 14:17:09.448276 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ee88379-95a2-4019-a41a-7931a5ab2f30","Type":"ContainerStarted","Data":"f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd"} Sep 30 14:17:09 crc kubenswrapper[4676]: I0930 14:17:09.448322 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"5e91fb257d3a45cd5a74b5617de04aa40d1ce872ef596abb2a4557538639b58d"} Sep 30 14:17:09 crc kubenswrapper[4676]: I0930 14:17:09.448343 4676 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 14:17:09 crc kubenswrapper[4676]: I0930 14:17:09.503713 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=113.555832951 podStartE2EDuration="2m4.503695828s" podCreationTimestamp="2025-09-30 14:15:05 +0000 UTC" firstStartedPulling="2025-09-30 14:15:19.034151701 +0000 UTC m=+1023.017240120" lastFinishedPulling="2025-09-30 14:15:29.982014568 +0000 UTC m=+1033.965102997" observedRunningTime="2025-09-30 14:17:09.482576017 +0000 UTC m=+1133.465664446" watchObservedRunningTime="2025-09-30 14:17:09.503695828 +0000 UTC m=+1133.486784257" Sep 30 14:17:10 crc kubenswrapper[4676]: I0930 14:17:10.455628 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d32e1e85-6a70-4751-9223-85e7018c3cc7","Type":"ContainerStarted","Data":"57b2c4c3985f96e63724996d6147259085ff756eaff7d4bbbf0987d70cca8042"} Sep 30 14:17:10 crc kubenswrapper[4676]: I0930 14:17:10.460144 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7t792" event={"ID":"a69f5429-b9b9-47c3-b720-1a59c5d87b27","Type":"ContainerStarted","Data":"446839e228e1525fa9888a2de4fae6a6eb5a4a0b16e2bf023a652e2b154c87aa"} Sep 30 14:17:11 crc kubenswrapper[4676]: E0930 14:17:11.463867 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Sep 30 14:17:11 crc kubenswrapper[4676]: E0930 14:17:11.464513 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Sep 30 14:17:11 crc kubenswrapper[4676]: E0930 14:17:11.464730 4676 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-785br,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(ad8d4649-f28a-4d12-884f-44308450c02b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:17:11 crc kubenswrapper[4676]: E0930 14:17:11.465965 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" Sep 30 14:17:11 crc kubenswrapper[4676]: I0930 14:17:11.471973 4676 generic.go:334] "Generic (PLEG): container finished" podID="a69f5429-b9b9-47c3-b720-1a59c5d87b27" containerID="446839e228e1525fa9888a2de4fae6a6eb5a4a0b16e2bf023a652e2b154c87aa" exitCode=0 Sep 30 14:17:11 crc kubenswrapper[4676]: I0930 14:17:11.472068 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7t792" 
event={"ID":"a69f5429-b9b9-47c3-b720-1a59c5d87b27","Type":"ContainerDied","Data":"446839e228e1525fa9888a2de4fae6a6eb5a4a0b16e2bf023a652e2b154c87aa"} Sep 30 14:17:11 crc kubenswrapper[4676]: I0930 14:17:11.475549 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7aef6349-6b33-4f9e-972d-a990cb3ff62e","Type":"ContainerStarted","Data":"251e0e0fe344f4f36c9da62babf6a7398455ee03578ddd8661af906f61f64f78"} Sep 30 14:17:12 crc kubenswrapper[4676]: E0930 14:17:12.424368 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="e73b3580-31d5-4c06-9bd8-acbd16c5c48d" Sep 30 14:17:12 crc kubenswrapper[4676]: E0930 14:17:12.430677 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="35e313d1-3779-4eb1-b12f-c3b5432dfd1d" Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.487272 4676 generic.go:334] "Generic (PLEG): container finished" podID="44390184-0cf3-4920-9d04-63311603ba59" containerID="ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945" exitCode=0 Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.487597 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" event={"ID":"44390184-0cf3-4920-9d04-63311603ba59","Type":"ContainerDied","Data":"ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945"} Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.489448 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"35e313d1-3779-4eb1-b12f-c3b5432dfd1d","Type":"ContainerStarted","Data":"2eb616e8b2549a08e2df6efffc2f506e2a4bf3d50c0a1bdfad6a781edca2205f"} Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.491896 4676 generic.go:334] "Generic (PLEG): container finished" podID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerID="07281f4813bc154b4dd8c2cc83444cc0373495625f41770f5b07d3dc294acc99" exitCode=0 Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.491965 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" event={"ID":"17ee04bb-14fc-4d85-9293-fe465ddc167d","Type":"ContainerDied","Data":"07281f4813bc154b4dd8c2cc83444cc0373495625f41770f5b07d3dc294acc99"} Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.498721 4676 generic.go:334] "Generic (PLEG): container finished" podID="b6fe7625-e1a3-4abd-b42b-591d546d5e37" containerID="ddebadf6975819f2a59d873dcfb0ed3828b28d8c54b43f5c1ad6cc365a4445ac" exitCode=0 Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.498846 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" event={"ID":"b6fe7625-e1a3-4abd-b42b-591d546d5e37","Type":"ContainerDied","Data":"ddebadf6975819f2a59d873dcfb0ed3828b28d8c54b43f5c1ad6cc365a4445ac"} Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.504125 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e73b3580-31d5-4c06-9bd8-acbd16c5c48d","Type":"ContainerStarted","Data":"02ae432f9d16ed73fad56e2a9aa2e4ab60c59d922e4f2f0095550e8878e60ecd"} Sep 30 14:17:12 crc kubenswrapper[4676]: E0930 14:17:12.509566 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="e73b3580-31d5-4c06-9bd8-acbd16c5c48d" Sep 30 14:17:12 
crc kubenswrapper[4676]: I0930 14:17:12.529342 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9762912c-8ca3-4791-93c0-4d5728543998","Type":"ContainerStarted","Data":"a04c5df4164ca44c2399424543eace3af2c8b1b60af11323573f864dc3ad4e2b"} Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.535588 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w8c76" event={"ID":"39632f21-878d-4bb9-ba72-afcac2cd0b5d","Type":"ContainerStarted","Data":"1b74c043edd44be5ef9e9e1153486cf01b9e581b3eea58c55755c254d2e2b748"} Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.573713 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7t792" event={"ID":"a69f5429-b9b9-47c3-b720-1a59c5d87b27","Type":"ContainerStarted","Data":"13465d98527f2e60aa7b36229d2b20ad5841ea3c7e47d7ab7216e02e57aa76e8"} Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.575001 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 14:17:12 crc kubenswrapper[4676]: E0930 14:17:12.576443 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.642839 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=25.735794106 podStartE2EDuration="2m3.642819449s" podCreationTimestamp="2025-09-30 14:15:09 +0000 UTC" firstStartedPulling="2025-09-30 14:15:26.617099776 +0000 UTC m=+1030.600188205" lastFinishedPulling="2025-09-30 14:17:04.524125119 +0000 UTC m=+1128.507213548" observedRunningTime="2025-09-30 14:17:12.639814568 +0000 UTC m=+1136.622902997" 
watchObservedRunningTime="2025-09-30 14:17:12.642819449 +0000 UTC m=+1136.625907878" Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.698937 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-w8c76" podStartSLOduration=27.372418459 podStartE2EDuration="30.698912065s" podCreationTimestamp="2025-09-30 14:16:42 +0000 UTC" firstStartedPulling="2025-09-30 14:17:08.270301419 +0000 UTC m=+1132.253389848" lastFinishedPulling="2025-09-30 14:17:11.596795025 +0000 UTC m=+1135.579883454" observedRunningTime="2025-09-30 14:17:12.661433862 +0000 UTC m=+1136.644522301" watchObservedRunningTime="2025-09-30 14:17:12.698912065 +0000 UTC m=+1136.682000494" Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.885081 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.952598 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-dns-svc\") pod \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.953251 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ffw8\" (UniqueName: \"kubernetes.io/projected/b6fe7625-e1a3-4abd-b42b-591d546d5e37-kube-api-access-6ffw8\") pod \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 14:17:12.954244 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-config\") pod \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\" (UID: \"b6fe7625-e1a3-4abd-b42b-591d546d5e37\") " Sep 30 14:17:12 crc kubenswrapper[4676]: I0930 
14:17:12.963425 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6fe7625-e1a3-4abd-b42b-591d546d5e37-kube-api-access-6ffw8" (OuterVolumeSpecName: "kube-api-access-6ffw8") pod "b6fe7625-e1a3-4abd-b42b-591d546d5e37" (UID: "b6fe7625-e1a3-4abd-b42b-591d546d5e37"). InnerVolumeSpecName "kube-api-access-6ffw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.001753 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-config" (OuterVolumeSpecName: "config") pod "b6fe7625-e1a3-4abd-b42b-591d546d5e37" (UID: "b6fe7625-e1a3-4abd-b42b-591d546d5e37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.003227 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6fe7625-e1a3-4abd-b42b-591d546d5e37" (UID: "b6fe7625-e1a3-4abd-b42b-591d546d5e37"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.057976 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.058007 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ffw8\" (UniqueName: \"kubernetes.io/projected/b6fe7625-e1a3-4abd-b42b-591d546d5e37-kube-api-access-6ffw8\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.058020 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fe7625-e1a3-4abd-b42b-591d546d5e37-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.585343 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" event={"ID":"44390184-0cf3-4920-9d04-63311603ba59","Type":"ContainerStarted","Data":"9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5"} Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.585946 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.588468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" event={"ID":"17ee04bb-14fc-4d85-9293-fe465ddc167d","Type":"ContainerStarted","Data":"121430a402f3098e0f8e2a63bbe8499b0f62579eb5aaaa82336bf60e604359ae"} Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.588608 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.592041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" 
event={"ID":"b6fe7625-e1a3-4abd-b42b-591d546d5e37","Type":"ContainerDied","Data":"478dc7fff95b07ff7747b232da99f30605451d6bbba400f9de97bfe47c28f506"} Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.592090 4676 scope.go:117] "RemoveContainer" containerID="ddebadf6975819f2a59d873dcfb0ed3828b28d8c54b43f5c1ad6cc365a4445ac" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.592198 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-kp5vg" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.602345 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7t792" event={"ID":"a69f5429-b9b9-47c3-b720-1a59c5d87b27","Type":"ContainerStarted","Data":"b61053cac0fd63a17d35e9cb0462b986aaadaa52dd6a9ae293b7eeb9432e7837"} Sep 30 14:17:13 crc kubenswrapper[4676]: E0930 14:17:13.631241 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="e73b3580-31d5-4c06-9bd8-acbd16c5c48d" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.655307 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7t792" podStartSLOduration=23.08413067 podStartE2EDuration="1m59.655268265s" podCreationTimestamp="2025-09-30 14:15:14 +0000 UTC" firstStartedPulling="2025-09-30 14:15:27.966086188 +0000 UTC m=+1031.949174617" lastFinishedPulling="2025-09-30 14:17:04.537223783 +0000 UTC m=+1128.520312212" observedRunningTime="2025-09-30 14:17:13.647364021 +0000 UTC m=+1137.630452460" watchObservedRunningTime="2025-09-30 14:17:13.655268265 +0000 UTC m=+1137.638356704" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.657954 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" podStartSLOduration=31.657939858 podStartE2EDuration="31.657939858s" podCreationTimestamp="2025-09-30 14:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:17:13.610553557 +0000 UTC m=+1137.593641976" watchObservedRunningTime="2025-09-30 14:17:13.657939858 +0000 UTC m=+1137.641028287" Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.739968 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kp5vg"] Sep 30 14:17:13 crc kubenswrapper[4676]: I0930 14:17:13.747116 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-kp5vg"] Sep 30 14:17:14 crc kubenswrapper[4676]: I0930 14:17:14.614717 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"35e313d1-3779-4eb1-b12f-c3b5432dfd1d","Type":"ContainerStarted","Data":"b174a83f489acdaa53bb86af7a832fc1898e63f72d9534daa409b7d02a250e8f"} Sep 30 14:17:14 crc kubenswrapper[4676]: I0930 14:17:14.615154 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:17:14 crc kubenswrapper[4676]: I0930 14:17:14.615565 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:17:14 crc kubenswrapper[4676]: I0930 14:17:14.640952 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" podStartSLOduration=32.640929858 podStartE2EDuration="32.640929858s" podCreationTimestamp="2025-09-30 14:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:17:13.7583028 +0000 UTC m=+1137.741391239" watchObservedRunningTime="2025-09-30 14:17:14.640929858 +0000 UTC m=+1138.624018287" Sep 30 14:17:14 
crc kubenswrapper[4676]: I0930 14:17:14.643905 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.611560874 podStartE2EDuration="2m0.643874028s" podCreationTimestamp="2025-09-30 14:15:14 +0000 UTC" firstStartedPulling="2025-09-30 14:15:28.264234775 +0000 UTC m=+1032.247323204" lastFinishedPulling="2025-09-30 14:17:13.296547929 +0000 UTC m=+1137.279636358" observedRunningTime="2025-09-30 14:17:14.635274255 +0000 UTC m=+1138.618362684" watchObservedRunningTime="2025-09-30 14:17:14.643874028 +0000 UTC m=+1138.626980827" Sep 30 14:17:15 crc kubenswrapper[4676]: I0930 14:17:15.442272 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6fe7625-e1a3-4abd-b42b-591d546d5e37" path="/var/lib/kubelet/pods/b6fe7625-e1a3-4abd-b42b-591d546d5e37/volumes" Sep 30 14:17:16 crc kubenswrapper[4676]: I0930 14:17:16.173027 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 14:17:16 crc kubenswrapper[4676]: I0930 14:17:16.173081 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 14:17:16 crc kubenswrapper[4676]: I0930 14:17:16.208204 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 14:17:17 crc kubenswrapper[4676]: I0930 14:17:17.957044 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:17:18 crc kubenswrapper[4676]: I0930 14:17:18.219069 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" Sep 30 14:17:18 crc kubenswrapper[4676]: I0930 14:17:18.285103 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dmzbf"] Sep 30 14:17:18 crc kubenswrapper[4676]: I0930 14:17:18.644738 4676 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" podUID="44390184-0cf3-4920-9d04-63311603ba59" containerName="dnsmasq-dns" containerID="cri-o://9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5" gracePeriod=10 Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.124375 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.265453 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g66gs\" (UniqueName: \"kubernetes.io/projected/44390184-0cf3-4920-9d04-63311603ba59-kube-api-access-g66gs\") pod \"44390184-0cf3-4920-9d04-63311603ba59\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.265499 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-dns-svc\") pod \"44390184-0cf3-4920-9d04-63311603ba59\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.265707 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-ovsdbserver-nb\") pod \"44390184-0cf3-4920-9d04-63311603ba59\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.265751 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-config\") pod \"44390184-0cf3-4920-9d04-63311603ba59\" (UID: \"44390184-0cf3-4920-9d04-63311603ba59\") " Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.279242 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/44390184-0cf3-4920-9d04-63311603ba59-kube-api-access-g66gs" (OuterVolumeSpecName: "kube-api-access-g66gs") pod "44390184-0cf3-4920-9d04-63311603ba59" (UID: "44390184-0cf3-4920-9d04-63311603ba59"). InnerVolumeSpecName "kube-api-access-g66gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.305541 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44390184-0cf3-4920-9d04-63311603ba59" (UID: "44390184-0cf3-4920-9d04-63311603ba59"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.305586 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44390184-0cf3-4920-9d04-63311603ba59" (UID: "44390184-0cf3-4920-9d04-63311603ba59"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.312800 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-config" (OuterVolumeSpecName: "config") pod "44390184-0cf3-4920-9d04-63311603ba59" (UID: "44390184-0cf3-4920-9d04-63311603ba59"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.368463 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.368516 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.368568 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g66gs\" (UniqueName: \"kubernetes.io/projected/44390184-0cf3-4920-9d04-63311603ba59-kube-api-access-g66gs\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.368587 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44390184-0cf3-4920-9d04-63311603ba59-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.657078 4676 generic.go:334] "Generic (PLEG): container finished" podID="44390184-0cf3-4920-9d04-63311603ba59" containerID="9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5" exitCode=0 Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.657150 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.657145 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" event={"ID":"44390184-0cf3-4920-9d04-63311603ba59","Type":"ContainerDied","Data":"9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5"} Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.657238 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dmzbf" event={"ID":"44390184-0cf3-4920-9d04-63311603ba59","Type":"ContainerDied","Data":"4ea46eb01b011bc39003f8ef8bd240a97642add49526aa27b7d1178289bf02ec"} Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.657270 4676 scope.go:117] "RemoveContainer" containerID="9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.688385 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dmzbf"] Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.694658 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dmzbf"] Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.702953 4676 scope.go:117] "RemoveContainer" containerID="ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.734080 4676 scope.go:117] "RemoveContainer" containerID="9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5" Sep 30 14:17:19 crc kubenswrapper[4676]: E0930 14:17:19.734616 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5\": container with ID starting with 9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5 not found: ID does not exist" 
containerID="9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.734675 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5"} err="failed to get container status \"9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5\": rpc error: code = NotFound desc = could not find container \"9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5\": container with ID starting with 9e722b0464a39ea68b7c22a897b2b1a8e1a17d50a45e8e5834a0d2c87b476ef5 not found: ID does not exist" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.734706 4676 scope.go:117] "RemoveContainer" containerID="ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945" Sep 30 14:17:19 crc kubenswrapper[4676]: E0930 14:17:19.735240 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945\": container with ID starting with ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945 not found: ID does not exist" containerID="ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.735278 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945"} err="failed to get container status \"ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945\": rpc error: code = NotFound desc = could not find container \"ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945\": container with ID starting with ee0d0a5f30255affb35ea1b1f8c82a4290262636134f53949593028377e48945 not found: ID does not exist" Sep 30 14:17:19 crc kubenswrapper[4676]: I0930 14:17:19.962042 4676 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 14:17:20 crc kubenswrapper[4676]: I0930 14:17:20.669467 4676 generic.go:334] "Generic (PLEG): container finished" podID="d32e1e85-6a70-4751-9223-85e7018c3cc7" containerID="57b2c4c3985f96e63724996d6147259085ff756eaff7d4bbbf0987d70cca8042" exitCode=0 Sep 30 14:17:20 crc kubenswrapper[4676]: I0930 14:17:20.669563 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d32e1e85-6a70-4751-9223-85e7018c3cc7","Type":"ContainerDied","Data":"57b2c4c3985f96e63724996d6147259085ff756eaff7d4bbbf0987d70cca8042"} Sep 30 14:17:20 crc kubenswrapper[4676]: I0930 14:17:20.671442 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9nlxb" event={"ID":"72431bf0-bd8f-431d-81a8-082f9ef654e1","Type":"ContainerStarted","Data":"0aa0e566bd37bbac63467d52b3764ca3d50466b2cd438c45f47bc915427679f7"} Sep 30 14:17:20 crc kubenswrapper[4676]: I0930 14:17:20.671639 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9nlxb" Sep 30 14:17:20 crc kubenswrapper[4676]: I0930 14:17:20.736672 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9nlxb" podStartSLOduration=14.927493318 podStartE2EDuration="2m6.736655834s" podCreationTimestamp="2025-09-30 14:15:14 +0000 UTC" firstStartedPulling="2025-09-30 14:15:27.799141439 +0000 UTC m=+1031.782229868" lastFinishedPulling="2025-09-30 14:17:19.608303955 +0000 UTC m=+1143.591392384" observedRunningTime="2025-09-30 14:17:20.730610091 +0000 UTC m=+1144.713698530" watchObservedRunningTime="2025-09-30 14:17:20.736655834 +0000 UTC m=+1144.719744263" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.224390 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.443251 4676 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="44390184-0cf3-4920-9d04-63311603ba59" path="/var/lib/kubelet/pods/44390184-0cf3-4920-9d04-63311603ba59/volumes" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.627345 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-zmzf2"] Sep 30 14:17:21 crc kubenswrapper[4676]: E0930 14:17:21.627742 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6fe7625-e1a3-4abd-b42b-591d546d5e37" containerName="init" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.627762 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6fe7625-e1a3-4abd-b42b-591d546d5e37" containerName="init" Sep 30 14:17:21 crc kubenswrapper[4676]: E0930 14:17:21.627784 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44390184-0cf3-4920-9d04-63311603ba59" containerName="dnsmasq-dns" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.627792 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="44390184-0cf3-4920-9d04-63311603ba59" containerName="dnsmasq-dns" Sep 30 14:17:21 crc kubenswrapper[4676]: E0930 14:17:21.627816 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44390184-0cf3-4920-9d04-63311603ba59" containerName="init" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.627824 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="44390184-0cf3-4920-9d04-63311603ba59" containerName="init" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.628018 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6fe7625-e1a3-4abd-b42b-591d546d5e37" containerName="init" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.628049 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="44390184-0cf3-4920-9d04-63311603ba59" containerName="dnsmasq-dns" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.629144 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.683326 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zmzf2"] Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.687406 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d32e1e85-6a70-4751-9223-85e7018c3cc7","Type":"ContainerStarted","Data":"14856e6842c1ac0e267accaa84b1a7c7f37a5391d7683fc91e7104f9233446c7"} Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.710843 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.710940 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-config\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.710991 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjbp\" (UniqueName: \"kubernetes.io/projected/70a50116-382a-4dae-8f0a-d47de54cffcf-kube-api-access-mtjbp\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.711124 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-dns-svc\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.711311 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.762295 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=38.052908791 podStartE2EDuration="2m14.762234346s" podCreationTimestamp="2025-09-30 14:15:07 +0000 UTC" firstStartedPulling="2025-09-30 14:15:27.801809873 +0000 UTC m=+1031.784898302" lastFinishedPulling="2025-09-30 14:17:04.511135428 +0000 UTC m=+1128.494223857" observedRunningTime="2025-09-30 14:17:21.759600875 +0000 UTC m=+1145.742689314" watchObservedRunningTime="2025-09-30 14:17:21.762234346 +0000 UTC m=+1145.745322775" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.813570 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.813676 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-config\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 
30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.813704 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjbp\" (UniqueName: \"kubernetes.io/projected/70a50116-382a-4dae-8f0a-d47de54cffcf-kube-api-access-mtjbp\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.813749 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-dns-svc\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.813810 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.814770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-config\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.814774 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.815153 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-dns-svc\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.815676 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.848828 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtjbp\" (UniqueName: \"kubernetes.io/projected/70a50116-382a-4dae-8f0a-d47de54cffcf-kube-api-access-mtjbp\") pod \"dnsmasq-dns-698758b865-zmzf2\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:21 crc kubenswrapper[4676]: I0930 14:17:21.960854 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.497222 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zmzf2"] Sep 30 14:17:22 crc kubenswrapper[4676]: W0930 14:17:22.504405 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70a50116_382a_4dae_8f0a_d47de54cffcf.slice/crio-a031752b598ddea02d7de95edc0ffffa34d8b6d442fb955b5de8d9c13d41ea05 WatchSource:0}: Error finding container a031752b598ddea02d7de95edc0ffffa34d8b6d442fb955b5de8d9c13d41ea05: Status 404 returned error can't find the container with id a031752b598ddea02d7de95edc0ffffa34d8b6d442fb955b5de8d9c13d41ea05 Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.696413 4676 generic.go:334] "Generic (PLEG): container finished" podID="9762912c-8ca3-4791-93c0-4d5728543998" containerID="a04c5df4164ca44c2399424543eace3af2c8b1b60af11323573f864dc3ad4e2b" exitCode=0 Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.696490 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9762912c-8ca3-4791-93c0-4d5728543998","Type":"ContainerDied","Data":"a04c5df4164ca44c2399424543eace3af2c8b1b60af11323573f864dc3ad4e2b"} Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.700069 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zmzf2" event={"ID":"70a50116-382a-4dae-8f0a-d47de54cffcf","Type":"ContainerStarted","Data":"a031752b598ddea02d7de95edc0ffffa34d8b6d442fb955b5de8d9c13d41ea05"} Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.894085 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.908728 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.911376 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lhtsr" Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.911786 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.911996 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.913798 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 30 14:17:22 crc kubenswrapper[4676]: I0930 14:17:22.915135 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.040431 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnv5g\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-kube-api-access-dnv5g\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.040652 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-cache\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.040718 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 
14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.040741 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-lock\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.040846 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.142979 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-lock\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.143456 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.143573 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.143606 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnv5g\" (UniqueName: 
\"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-kube-api-access-dnv5g\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.143668 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-lock\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.143676 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-cache\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: E0930 14:17:23.143749 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 14:17:23 crc kubenswrapper[4676]: E0930 14:17:23.143771 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 14:17:23 crc kubenswrapper[4676]: E0930 14:17:23.143835 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift podName:6c0230d2-8bbc-4ad0-8f3d-062d2d940013 nodeName:}" failed. No retries permitted until 2025-09-30 14:17:23.64380959 +0000 UTC m=+1147.626898079 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift") pod "swift-storage-0" (UID: "6c0230d2-8bbc-4ad0-8f3d-062d2d940013") : configmap "swift-ring-files" not found Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.143836 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.144323 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-cache\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.171530 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnv5g\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-kube-api-access-dnv5g\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.175342 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.371667 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vbg62"] Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.373022 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.377182 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.377387 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.380317 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.391934 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vbg62"] Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.449324 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r6vk\" (UniqueName: \"kubernetes.io/projected/fb668321-0ea7-4c30-9773-6c7f511959f4-kube-api-access-2r6vk\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.449403 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-dispersionconf\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.449474 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb668321-0ea7-4c30-9773-6c7f511959f4-etc-swift\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 
14:17:23.449523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-ring-data-devices\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.449566 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-scripts\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.449586 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-combined-ca-bundle\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.449701 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-swiftconf\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.552605 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-swiftconf\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.554503 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r6vk\" (UniqueName: \"kubernetes.io/projected/fb668321-0ea7-4c30-9773-6c7f511959f4-kube-api-access-2r6vk\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.554616 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-dispersionconf\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.554733 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb668321-0ea7-4c30-9773-6c7f511959f4-etc-swift\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.554762 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-ring-data-devices\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.554846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-scripts\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.554894 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-combined-ca-bundle\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.556964 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-ring-data-devices\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.557037 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb668321-0ea7-4c30-9773-6c7f511959f4-etc-swift\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.557347 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-scripts\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.560005 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-dispersionconf\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.560382 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-swiftconf\") pod 
\"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.560710 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-combined-ca-bundle\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.574707 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r6vk\" (UniqueName: \"kubernetes.io/projected/fb668321-0ea7-4c30-9773-6c7f511959f4-kube-api-access-2r6vk\") pod \"swift-ring-rebalance-vbg62\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.656198 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:23 crc kubenswrapper[4676]: E0930 14:17:23.656443 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 14:17:23 crc kubenswrapper[4676]: E0930 14:17:23.656480 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 14:17:23 crc kubenswrapper[4676]: E0930 14:17:23.656552 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift podName:6c0230d2-8bbc-4ad0-8f3d-062d2d940013 nodeName:}" failed. 
No retries permitted until 2025-09-30 14:17:24.656527599 +0000 UTC m=+1148.639616028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift") pod "swift-storage-0" (UID: "6c0230d2-8bbc-4ad0-8f3d-062d2d940013") : configmap "swift-ring-files" not found Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.696756 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.709607 4676 generic.go:334] "Generic (PLEG): container finished" podID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerID="4087fb74a9a71f9b96ff7c1db8c5c123be4ab7d7437366d6e270a91611bd8d05" exitCode=0 Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.709686 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zmzf2" event={"ID":"70a50116-382a-4dae-8f0a-d47de54cffcf","Type":"ContainerDied","Data":"4087fb74a9a71f9b96ff7c1db8c5c123be4ab7d7437366d6e270a91611bd8d05"} Sep 30 14:17:23 crc kubenswrapper[4676]: I0930 14:17:23.713076 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9762912c-8ca3-4791-93c0-4d5728543998","Type":"ContainerStarted","Data":"3664d1eed83c3c837c3f5c348b265b0164dcc17a6b9ae41de2fcc3d6ea0708ef"} Sep 30 14:17:24 crc kubenswrapper[4676]: I0930 14:17:24.170678 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=34.965496295 podStartE2EDuration="2m16.170651806s" podCreationTimestamp="2025-09-30 14:15:08 +0000 UTC" firstStartedPulling="2025-09-30 14:15:27.967893369 +0000 UTC m=+1031.950981798" lastFinishedPulling="2025-09-30 14:17:09.17304888 +0000 UTC m=+1133.156137309" observedRunningTime="2025-09-30 14:17:23.752589095 +0000 UTC m=+1147.735677604" watchObservedRunningTime="2025-09-30 
14:17:24.170651806 +0000 UTC m=+1148.153740235" Sep 30 14:17:24 crc kubenswrapper[4676]: I0930 14:17:24.173850 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vbg62"] Sep 30 14:17:24 crc kubenswrapper[4676]: I0930 14:17:24.675448 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:24 crc kubenswrapper[4676]: E0930 14:17:24.675685 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 14:17:24 crc kubenswrapper[4676]: E0930 14:17:24.675716 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 14:17:24 crc kubenswrapper[4676]: E0930 14:17:24.675778 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift podName:6c0230d2-8bbc-4ad0-8f3d-062d2d940013 nodeName:}" failed. No retries permitted until 2025-09-30 14:17:26.675760839 +0000 UTC m=+1150.658849268 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift") pod "swift-storage-0" (UID: "6c0230d2-8bbc-4ad0-8f3d-062d2d940013") : configmap "swift-ring-files" not found
Sep 30 14:17:24 crc kubenswrapper[4676]: I0930 14:17:24.721376 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbg62" event={"ID":"fb668321-0ea7-4c30-9773-6c7f511959f4","Type":"ContainerStarted","Data":"cea21690f828c4aa63ccdad007a4163c0c8744eebdb828fa7cffce2374dde70f"}
Sep 30 14:17:24 crc kubenswrapper[4676]: I0930 14:17:24.723813 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zmzf2" event={"ID":"70a50116-382a-4dae-8f0a-d47de54cffcf","Type":"ContainerStarted","Data":"95c822ac8de5d5cc4cbc980762011dabb92ba7666e85647803c484eebce9cac6"}
Sep 30 14:17:24 crc kubenswrapper[4676]: I0930 14:17:24.724057 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-zmzf2"
Sep 30 14:17:24 crc kubenswrapper[4676]: I0930 14:17:24.746258 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-zmzf2" podStartSLOduration=3.746240304 podStartE2EDuration="3.746240304s" podCreationTimestamp="2025-09-30 14:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:17:24.742508763 +0000 UTC m=+1148.725597192" watchObservedRunningTime="2025-09-30 14:17:24.746240304 +0000 UTC m=+1148.729328733"
Sep 30 14:17:25 crc kubenswrapper[4676]: I0930 14:17:25.736717 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e73b3580-31d5-4c06-9bd8-acbd16c5c48d","Type":"ContainerStarted","Data":"825a14aa88f7f499864e5663e1518ae98b48fae978b05c5ef6f83a97a263092f"}
Sep 30 14:17:25 crc kubenswrapper[4676]: I0930 14:17:25.766342 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.119642897 podStartE2EDuration="2m8.766320867s" podCreationTimestamp="2025-09-30 14:15:17 +0000 UTC" firstStartedPulling="2025-09-30 14:15:28.657130315 +0000 UTC m=+1032.640218744" lastFinishedPulling="2025-09-30 14:17:25.303808285 +0000 UTC m=+1149.286896714" observedRunningTime="2025-09-30 14:17:25.76236119 +0000 UTC m=+1149.745449629" watchObservedRunningTime="2025-09-30 14:17:25.766320867 +0000 UTC m=+1149.749409296"
Sep 30 14:17:26 crc kubenswrapper[4676]: I0930 14:17:26.551096 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Sep 30 14:17:26 crc kubenswrapper[4676]: I0930 14:17:26.712370 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0"
Sep 30 14:17:26 crc kubenswrapper[4676]: E0930 14:17:26.712560 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Sep 30 14:17:26 crc kubenswrapper[4676]: E0930 14:17:26.712577 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Sep 30 14:17:26 crc kubenswrapper[4676]: E0930 14:17:26.712628 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift podName:6c0230d2-8bbc-4ad0-8f3d-062d2d940013 nodeName:}" failed. No retries permitted until 2025-09-30 14:17:30.712613675 +0000 UTC m=+1154.695702104 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift") pod "swift-storage-0" (UID: "6c0230d2-8bbc-4ad0-8f3d-062d2d940013") : configmap "swift-ring-files" not found
Sep 30 14:17:26 crc kubenswrapper[4676]: I0930 14:17:26.843083 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Sep 30 14:17:28 crc kubenswrapper[4676]: I0930 14:17:28.401570 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Sep 30 14:17:28 crc kubenswrapper[4676]: I0930 14:17:28.449721 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Sep 30 14:17:28 crc kubenswrapper[4676]: I0930 14:17:28.758158 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Sep 30 14:17:29 crc kubenswrapper[4676]: I0930 14:17:29.093787 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Sep 30 14:17:29 crc kubenswrapper[4676]: I0930 14:17:29.094156 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Sep 30 14:17:29 crc kubenswrapper[4676]: I0930 14:17:29.149577 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Sep 30 14:17:29 crc kubenswrapper[4676]: I0930 14:17:29.535676 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Sep 30 14:17:29 crc kubenswrapper[4676]: I0930 14:17:29.535751 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Sep 30 14:17:29 crc kubenswrapper[4676]: I0930 14:17:29.843622 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.242951 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6j825"]
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.244051 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6j825"
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.253999 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6j825"]
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.375922 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhq2n\" (UniqueName: \"kubernetes.io/projected/1b1b569a-2fee-4891-8eb7-279d8efe9600-kube-api-access-mhq2n\") pod \"glance-db-create-6j825\" (UID: \"1b1b569a-2fee-4891-8eb7-279d8efe9600\") " pod="openstack/glance-db-create-6j825"
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.477698 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhq2n\" (UniqueName: \"kubernetes.io/projected/1b1b569a-2fee-4891-8eb7-279d8efe9600-kube-api-access-mhq2n\") pod \"glance-db-create-6j825\" (UID: \"1b1b569a-2fee-4891-8eb7-279d8efe9600\") " pod="openstack/glance-db-create-6j825"
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.500914 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhq2n\" (UniqueName: \"kubernetes.io/projected/1b1b569a-2fee-4891-8eb7-279d8efe9600-kube-api-access-mhq2n\") pod \"glance-db-create-6j825\" (UID: \"1b1b569a-2fee-4891-8eb7-279d8efe9600\") " pod="openstack/glance-db-create-6j825"
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.570566 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6j825"
Sep 30 14:17:30 crc kubenswrapper[4676]: I0930 14:17:30.782250 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0"
Sep 30 14:17:30 crc kubenswrapper[4676]: E0930 14:17:30.782439 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Sep 30 14:17:30 crc kubenswrapper[4676]: E0930 14:17:30.782495 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Sep 30 14:17:30 crc kubenswrapper[4676]: E0930 14:17:30.782549 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift podName:6c0230d2-8bbc-4ad0-8f3d-062d2d940013 nodeName:}" failed. No retries permitted until 2025-09-30 14:17:38.782532176 +0000 UTC m=+1162.765620605 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift") pod "swift-storage-0" (UID: "6c0230d2-8bbc-4ad0-8f3d-062d2d940013") : configmap "swift-ring-files" not found
Sep 30 14:17:31 crc kubenswrapper[4676]: I0930 14:17:31.019044 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6j825"]
Sep 30 14:17:31 crc kubenswrapper[4676]: W0930 14:17:31.030689 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b1b569a_2fee_4891_8eb7_279d8efe9600.slice/crio-f5b8b3aa399f3eb362c60adba7d60a91f3512aa117f906fc1bd0ce3f2ae0b0d3 WatchSource:0}: Error finding container f5b8b3aa399f3eb362c60adba7d60a91f3512aa117f906fc1bd0ce3f2ae0b0d3: Status 404 returned error can't find the container with id f5b8b3aa399f3eb362c60adba7d60a91f3512aa117f906fc1bd0ce3f2ae0b0d3
Sep 30 14:17:31 crc kubenswrapper[4676]: I0930 14:17:31.790487 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6j825" event={"ID":"1b1b569a-2fee-4891-8eb7-279d8efe9600","Type":"ContainerStarted","Data":"f5b8b3aa399f3eb362c60adba7d60a91f3512aa117f906fc1bd0ce3f2ae0b0d3"}
Sep 30 14:17:31 crc kubenswrapper[4676]: I0930 14:17:31.962247 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-zmzf2"
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.021194 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c4hz6"]
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.021624 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerName="dnsmasq-dns" containerID="cri-o://121430a402f3098e0f8e2a63bbe8499b0f62579eb5aaaa82336bf60e604359ae" gracePeriod=10
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.811489 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbg62" event={"ID":"fb668321-0ea7-4c30-9773-6c7f511959f4","Type":"ContainerStarted","Data":"c5e5548589305e5cfc9fd6b658354112d72a6dacb865ba1d7017a328e34acb22"}
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.813606 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6j825" event={"ID":"1b1b569a-2fee-4891-8eb7-279d8efe9600","Type":"ContainerStarted","Data":"20f52802eb411ddd44c1482ca9bae3b03ee472d9745b38fd59b42ce41f2d7667"}
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.817680 4676 generic.go:334] "Generic (PLEG): container finished" podID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerID="121430a402f3098e0f8e2a63bbe8499b0f62579eb5aaaa82336bf60e604359ae" exitCode=0
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.817824 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" event={"ID":"17ee04bb-14fc-4d85-9293-fe465ddc167d","Type":"ContainerDied","Data":"121430a402f3098e0f8e2a63bbe8499b0f62579eb5aaaa82336bf60e604359ae"}
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.820122 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad8d4649-f28a-4d12-884f-44308450c02b","Type":"ContainerStarted","Data":"f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738"}
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.820407 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.843869 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vbg62" podStartSLOduration=4.268794169 podStartE2EDuration="9.843841453s" podCreationTimestamp="2025-09-30 14:17:23 +0000 UTC" firstStartedPulling="2025-09-30 14:17:24.182785854 +0000 UTC m=+1148.165874283" lastFinishedPulling="2025-09-30 14:17:29.757833138 +0000 UTC m=+1153.740921567" observedRunningTime="2025-09-30 14:17:32.836542266 +0000 UTC m=+1156.819630695" watchObservedRunningTime="2025-09-30 14:17:32.843841453 +0000 UTC m=+1156.826929892"
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.857150 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-6j825" podStartSLOduration=2.857101292 podStartE2EDuration="2.857101292s" podCreationTimestamp="2025-09-30 14:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:17:32.848690344 +0000 UTC m=+1156.831778773" watchObservedRunningTime="2025-09-30 14:17:32.857101292 +0000 UTC m=+1156.840189721"
Sep 30 14:17:32 crc kubenswrapper[4676]: I0930 14:17:32.872382 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.72028171 podStartE2EDuration="2m21.872355864s" podCreationTimestamp="2025-09-30 14:15:11 +0000 UTC" firstStartedPulling="2025-09-30 14:15:26.608251401 +0000 UTC m=+1030.591339830" lastFinishedPulling="2025-09-30 14:17:29.760325555 +0000 UTC m=+1153.743413984" observedRunningTime="2025-09-30 14:17:32.867784051 +0000 UTC m=+1156.850872480" watchObservedRunningTime="2025-09-30 14:17:32.872355864 +0000 UTC m=+1156.855444293"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.219278 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.634473 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.640743 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.710341 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.745225 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnzrd\" (UniqueName: \"kubernetes.io/projected/17ee04bb-14fc-4d85-9293-fe465ddc167d-kube-api-access-lnzrd\") pod \"17ee04bb-14fc-4d85-9293-fe465ddc167d\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") "
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.745354 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-nb\") pod \"17ee04bb-14fc-4d85-9293-fe465ddc167d\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") "
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.745541 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-dns-svc\") pod \"17ee04bb-14fc-4d85-9293-fe465ddc167d\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") "
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.745651 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-sb\") pod \"17ee04bb-14fc-4d85-9293-fe465ddc167d\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") "
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.745731 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-config\") pod \"17ee04bb-14fc-4d85-9293-fe465ddc167d\" (UID: \"17ee04bb-14fc-4d85-9293-fe465ddc167d\") "
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.757481 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ee04bb-14fc-4d85-9293-fe465ddc167d-kube-api-access-lnzrd" (OuterVolumeSpecName: "kube-api-access-lnzrd") pod "17ee04bb-14fc-4d85-9293-fe465ddc167d" (UID: "17ee04bb-14fc-4d85-9293-fe465ddc167d"). InnerVolumeSpecName "kube-api-access-lnzrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.793939 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17ee04bb-14fc-4d85-9293-fe465ddc167d" (UID: "17ee04bb-14fc-4d85-9293-fe465ddc167d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.798578 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17ee04bb-14fc-4d85-9293-fe465ddc167d" (UID: "17ee04bb-14fc-4d85-9293-fe465ddc167d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.820929 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-config" (OuterVolumeSpecName: "config") pod "17ee04bb-14fc-4d85-9293-fe465ddc167d" (UID: "17ee04bb-14fc-4d85-9293-fe465ddc167d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.827909 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17ee04bb-14fc-4d85-9293-fe465ddc167d" (UID: "17ee04bb-14fc-4d85-9293-fe465ddc167d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.834249 4676 generic.go:334] "Generic (PLEG): container finished" podID="1b1b569a-2fee-4891-8eb7-279d8efe9600" containerID="20f52802eb411ddd44c1482ca9bae3b03ee472d9745b38fd59b42ce41f2d7667" exitCode=0
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.834334 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6j825" event={"ID":"1b1b569a-2fee-4891-8eb7-279d8efe9600","Type":"ContainerDied","Data":"20f52802eb411ddd44c1482ca9bae3b03ee472d9745b38fd59b42ce41f2d7667"}
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.837100 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.837182 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-c4hz6" event={"ID":"17ee04bb-14fc-4d85-9293-fe465ddc167d","Type":"ContainerDied","Data":"8d5f6a5a29763fc7de515e0cac178c8a55dfd42f7fb13de94e71930622eb03b5"}
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.837226 4676 scope.go:117] "RemoveContainer" containerID="121430a402f3098e0f8e2a63bbe8499b0f62579eb5aaaa82336bf60e604359ae"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.849633 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-config\") on node \"crc\" DevicePath \"\""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.849680 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnzrd\" (UniqueName: \"kubernetes.io/projected/17ee04bb-14fc-4d85-9293-fe465ddc167d-kube-api-access-lnzrd\") on node \"crc\" DevicePath \"\""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.849696 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.849711 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.849724 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ee04bb-14fc-4d85-9293-fe465ddc167d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.898966 4676 scope.go:117] "RemoveContainer" containerID="07281f4813bc154b4dd8c2cc83444cc0373495625f41770f5b07d3dc294acc99"
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.917838 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c4hz6"]
Sep 30 14:17:33 crc kubenswrapper[4676]: I0930 14:17:33.935427 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-c4hz6"]
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.443165 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.622351 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 14:17:34 crc kubenswrapper[4676]: E0930 14:17:34.623067 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerName="dnsmasq-dns"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.623087 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerName="dnsmasq-dns"
Sep 30 14:17:34 crc kubenswrapper[4676]: E0930 14:17:34.623116 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerName="init"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.623122 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerName="init"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.623288 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" containerName="dnsmasq-dns"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.624100 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.628061 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vn67z"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.629020 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.629166 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.629399 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.646962 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.663705 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-scripts\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.663770 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.663807 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-config\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.663853 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqn4v\" (UniqueName: \"kubernetes.io/projected/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-kube-api-access-gqn4v\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.664039 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.664101 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.664141 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.765849 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqn4v\" (UniqueName: \"kubernetes.io/projected/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-kube-api-access-gqn4v\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.766014 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.766099 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.766160 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.766196 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-scripts\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.766251 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.766281 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-config\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.766815 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.767617 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-scripts\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.767718 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-config\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.773178 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.773301 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.777093 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.784370 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqn4v\" (UniqueName: \"kubernetes.io/projected/2aa43108-6602-4fc3-b8b1-ce07a8ef0b31-kube-api-access-gqn4v\") pod \"ovn-northd-0\" (UID: \"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31\") " pod="openstack/ovn-northd-0"
Sep 30 14:17:34 crc kubenswrapper[4676]: I0930 14:17:34.943499 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.107573 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6j825"
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.172487 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhq2n\" (UniqueName: \"kubernetes.io/projected/1b1b569a-2fee-4891-8eb7-279d8efe9600-kube-api-access-mhq2n\") pod \"1b1b569a-2fee-4891-8eb7-279d8efe9600\" (UID: \"1b1b569a-2fee-4891-8eb7-279d8efe9600\") "
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.177773 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1b569a-2fee-4891-8eb7-279d8efe9600-kube-api-access-mhq2n" (OuterVolumeSpecName: "kube-api-access-mhq2n") pod "1b1b569a-2fee-4891-8eb7-279d8efe9600" (UID: "1b1b569a-2fee-4891-8eb7-279d8efe9600"). InnerVolumeSpecName "kube-api-access-mhq2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.274706 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhq2n\" (UniqueName: \"kubernetes.io/projected/1b1b569a-2fee-4891-8eb7-279d8efe9600-kube-api-access-mhq2n\") on node \"crc\" DevicePath \"\""
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.429689 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 14:17:35 crc kubenswrapper[4676]: W0930 14:17:35.434250 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aa43108_6602_4fc3_b8b1_ce07a8ef0b31.slice/crio-8bf6387e2aa0f4f75ae8918c58e917c2986d53199a6ee84a9aee56fa30d01b19 WatchSource:0}: Error finding container 8bf6387e2aa0f4f75ae8918c58e917c2986d53199a6ee84a9aee56fa30d01b19: Status 404 returned error can't find the container with id 8bf6387e2aa0f4f75ae8918c58e917c2986d53199a6ee84a9aee56fa30d01b19
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.445367 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ee04bb-14fc-4d85-9293-fe465ddc167d" path="/var/lib/kubelet/pods/17ee04bb-14fc-4d85-9293-fe465ddc167d/volumes"
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.855440 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6j825" event={"ID":"1b1b569a-2fee-4891-8eb7-279d8efe9600","Type":"ContainerDied","Data":"f5b8b3aa399f3eb362c60adba7d60a91f3512aa117f906fc1bd0ce3f2ae0b0d3"}
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.855490 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6j825"
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.855496 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b8b3aa399f3eb362c60adba7d60a91f3512aa117f906fc1bd0ce3f2ae0b0d3"
Sep 30 14:17:35 crc kubenswrapper[4676]: I0930 14:17:35.856897 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31","Type":"ContainerStarted","Data":"8bf6387e2aa0f4f75ae8918c58e917c2986d53199a6ee84a9aee56fa30d01b19"}
Sep 30 14:17:37 crc kubenswrapper[4676]: I0930 14:17:37.875649 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31","Type":"ContainerStarted","Data":"24f252e88ba0d65a780903cc438e607c7832b7d7784ebafa81f02cdbf74f0663"}
Sep 30 14:17:37 crc kubenswrapper[4676]: I0930 14:17:37.877786 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Sep 30 14:17:37 crc kubenswrapper[4676]: I0930 14:17:37.877897 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2aa43108-6602-4fc3-b8b1-ce07a8ef0b31","Type":"ContainerStarted","Data":"b0f70ba713e0b9aa4824df241f8636ada160029d328dce039c660978f850af19"}
Sep 30 14:17:37 crc kubenswrapper[4676]: I0930 14:17:37.909003 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.324111375 podStartE2EDuration="3.908981245s" podCreationTimestamp="2025-09-30 14:17:34 +0000 UTC" firstStartedPulling="2025-09-30 14:17:35.436512803 +0000 UTC m=+1159.419601232" lastFinishedPulling="2025-09-30 14:17:37.021382673 +0000 UTC m=+1161.004471102" observedRunningTime="2025-09-30 14:17:37.89511479 +0000 UTC m=+1161.878203239" watchObservedRunningTime="2025-09-30 14:17:37.908981245 +0000 UTC m=+1161.892069674"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.361445 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-l4wcb"]
Sep 30 14:17:38 crc kubenswrapper[4676]: E0930 14:17:38.361828 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1b569a-2fee-4891-8eb7-279d8efe9600" containerName="mariadb-database-create"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.361842 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1b569a-2fee-4891-8eb7-279d8efe9600" containerName="mariadb-database-create"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.362059 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1b569a-2fee-4891-8eb7-279d8efe9600" containerName="mariadb-database-create"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.362823 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l4wcb"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.375580 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l4wcb"]
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.430728 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk2ln\" (UniqueName: \"kubernetes.io/projected/b9f4602f-67b5-451d-9944-b3125ac805b2-kube-api-access-zk2ln\") pod \"barbican-db-create-l4wcb\" (UID: \"b9f4602f-67b5-451d-9944-b3125ac805b2\") " pod="openstack/barbican-db-create-l4wcb"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.468224 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-msxhv"]
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.470206 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-msxhv"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.480797 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-msxhv"]
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.532166 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhfgv\" (UniqueName: \"kubernetes.io/projected/53f1c642-a974-44d6-8d01-457730d2a186-kube-api-access-fhfgv\") pod \"cinder-db-create-msxhv\" (UID: \"53f1c642-a974-44d6-8d01-457730d2a186\") " pod="openstack/cinder-db-create-msxhv"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.532265 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk2ln\" (UniqueName: \"kubernetes.io/projected/b9f4602f-67b5-451d-9944-b3125ac805b2-kube-api-access-zk2ln\") pod \"barbican-db-create-l4wcb\" (UID: \"b9f4602f-67b5-451d-9944-b3125ac805b2\") " pod="openstack/barbican-db-create-l4wcb"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.564473 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk2ln\" (UniqueName: \"kubernetes.io/projected/b9f4602f-67b5-451d-9944-b3125ac805b2-kube-api-access-zk2ln\") pod \"barbican-db-create-l4wcb\" (UID: \"b9f4602f-67b5-451d-9944-b3125ac805b2\") " pod="openstack/barbican-db-create-l4wcb"
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.583450 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-k4qfl"]
Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.586012 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-k4qfl" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.610467 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k4qfl"] Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.633770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhfgv\" (UniqueName: \"kubernetes.io/projected/53f1c642-a974-44d6-8d01-457730d2a186-kube-api-access-fhfgv\") pod \"cinder-db-create-msxhv\" (UID: \"53f1c642-a974-44d6-8d01-457730d2a186\") " pod="openstack/cinder-db-create-msxhv" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.633830 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k62b5\" (UniqueName: \"kubernetes.io/projected/dd479fd7-ab79-40ee-a0f0-e4f2b192c80c-kube-api-access-k62b5\") pod \"neutron-db-create-k4qfl\" (UID: \"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c\") " pod="openstack/neutron-db-create-k4qfl" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.656225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhfgv\" (UniqueName: \"kubernetes.io/projected/53f1c642-a974-44d6-8d01-457730d2a186-kube-api-access-fhfgv\") pod \"cinder-db-create-msxhv\" (UID: \"53f1c642-a974-44d6-8d01-457730d2a186\") " pod="openstack/cinder-db-create-msxhv" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.693469 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-l4wcb" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.735830 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k62b5\" (UniqueName: \"kubernetes.io/projected/dd479fd7-ab79-40ee-a0f0-e4f2b192c80c-kube-api-access-k62b5\") pod \"neutron-db-create-k4qfl\" (UID: \"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c\") " pod="openstack/neutron-db-create-k4qfl" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.757261 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k62b5\" (UniqueName: \"kubernetes.io/projected/dd479fd7-ab79-40ee-a0f0-e4f2b192c80c-kube-api-access-k62b5\") pod \"neutron-db-create-k4qfl\" (UID: \"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c\") " pod="openstack/neutron-db-create-k4qfl" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.790331 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-msxhv" Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.837840 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:38 crc kubenswrapper[4676]: E0930 14:17:38.838011 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 14:17:38 crc kubenswrapper[4676]: E0930 14:17:38.838252 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 14:17:38 crc kubenswrapper[4676]: E0930 14:17:38.838308 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift 
podName:6c0230d2-8bbc-4ad0-8f3d-062d2d940013 nodeName:}" failed. No retries permitted until 2025-09-30 14:17:54.838290534 +0000 UTC m=+1178.821378963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift") pod "swift-storage-0" (UID: "6c0230d2-8bbc-4ad0-8f3d-062d2d940013") : configmap "swift-ring-files" not found Sep 30 14:17:38 crc kubenswrapper[4676]: I0930 14:17:38.915082 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k4qfl" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.115688 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l4wcb"] Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.232907 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-msxhv"] Sep 30 14:17:39 crc kubenswrapper[4676]: W0930 14:17:39.253058 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53f1c642_a974_44d6_8d01_457730d2a186.slice/crio-30c15169eb717c822c11e8997fa8b43aa531affec2f6625ca7d1a810f064b9a7 WatchSource:0}: Error finding container 30c15169eb717c822c11e8997fa8b43aa531affec2f6625ca7d1a810f064b9a7: Status 404 returned error can't find the container with id 30c15169eb717c822c11e8997fa8b43aa531affec2f6625ca7d1a810f064b9a7 Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.359974 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k4qfl"] Sep 30 14:17:39 crc kubenswrapper[4676]: W0930 14:17:39.366792 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd479fd7_ab79_40ee_a0f0_e4f2b192c80c.slice/crio-c8e24f0b0fc3b0bd9ce8d3ce50250e55569333f7fd97b596053345d27532c9ab WatchSource:0}: Error finding container 
c8e24f0b0fc3b0bd9ce8d3ce50250e55569333f7fd97b596053345d27532c9ab: Status 404 returned error can't find the container with id c8e24f0b0fc3b0bd9ce8d3ce50250e55569333f7fd97b596053345d27532c9ab Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.528485 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hw99z"] Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.529759 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hw99z" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.554970 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59bt\" (UniqueName: \"kubernetes.io/projected/1ac6c313-76fd-461b-9ca1-b93c7e1fb915-kube-api-access-t59bt\") pod \"keystone-db-create-hw99z\" (UID: \"1ac6c313-76fd-461b-9ca1-b93c7e1fb915\") " pod="openstack/keystone-db-create-hw99z" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.558442 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hw99z"] Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.656263 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t59bt\" (UniqueName: \"kubernetes.io/projected/1ac6c313-76fd-461b-9ca1-b93c7e1fb915-kube-api-access-t59bt\") pod \"keystone-db-create-hw99z\" (UID: \"1ac6c313-76fd-461b-9ca1-b93c7e1fb915\") " pod="openstack/keystone-db-create-hw99z" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.679148 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t59bt\" (UniqueName: \"kubernetes.io/projected/1ac6c313-76fd-461b-9ca1-b93c7e1fb915-kube-api-access-t59bt\") pod \"keystone-db-create-hw99z\" (UID: \"1ac6c313-76fd-461b-9ca1-b93c7e1fb915\") " pod="openstack/keystone-db-create-hw99z" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.822671 4676 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-create-jlwpr"] Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.824563 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jlwpr" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.833398 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jlwpr"] Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.859397 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hw99z" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.859948 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5tkn\" (UniqueName: \"kubernetes.io/projected/9ca1c7f3-f23d-4e35-9183-4cbf943e567d-kube-api-access-h5tkn\") pod \"placement-db-create-jlwpr\" (UID: \"9ca1c7f3-f23d-4e35-9183-4cbf943e567d\") " pod="openstack/placement-db-create-jlwpr" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.893236 4676 generic.go:334] "Generic (PLEG): container finished" podID="b9f4602f-67b5-451d-9944-b3125ac805b2" containerID="23fe04c2f3809d1f1c2287ff1b727f9347f93f0336fd8272f23001dbf8600b22" exitCode=0 Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.893323 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l4wcb" event={"ID":"b9f4602f-67b5-451d-9944-b3125ac805b2","Type":"ContainerDied","Data":"23fe04c2f3809d1f1c2287ff1b727f9347f93f0336fd8272f23001dbf8600b22"} Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.893664 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l4wcb" event={"ID":"b9f4602f-67b5-451d-9944-b3125ac805b2","Type":"ContainerStarted","Data":"b4386e29195b3295e6401c72ff4aa46f4ccb5dd471277e0ee835d897f07e7881"} Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.896411 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="53f1c642-a974-44d6-8d01-457730d2a186" containerID="92cb55d388f6edb10e4c1e36fa5a9909cf2af12ec1c63b66cc7624d13a3a8e13" exitCode=0 Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.896471 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-msxhv" event={"ID":"53f1c642-a974-44d6-8d01-457730d2a186","Type":"ContainerDied","Data":"92cb55d388f6edb10e4c1e36fa5a9909cf2af12ec1c63b66cc7624d13a3a8e13"} Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.896501 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-msxhv" event={"ID":"53f1c642-a974-44d6-8d01-457730d2a186","Type":"ContainerStarted","Data":"30c15169eb717c822c11e8997fa8b43aa531affec2f6625ca7d1a810f064b9a7"} Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.899775 4676 generic.go:334] "Generic (PLEG): container finished" podID="dd479fd7-ab79-40ee-a0f0-e4f2b192c80c" containerID="d0ce6ba72a1c93abd37956fee26ea9e9d4e759d0f1e835d8ab6423288b3aed90" exitCode=0 Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.900587 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k4qfl" event={"ID":"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c","Type":"ContainerDied","Data":"d0ce6ba72a1c93abd37956fee26ea9e9d4e759d0f1e835d8ab6423288b3aed90"} Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.900606 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k4qfl" event={"ID":"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c","Type":"ContainerStarted","Data":"c8e24f0b0fc3b0bd9ce8d3ce50250e55569333f7fd97b596053345d27532c9ab"} Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.961708 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5tkn\" (UniqueName: \"kubernetes.io/projected/9ca1c7f3-f23d-4e35-9183-4cbf943e567d-kube-api-access-h5tkn\") pod \"placement-db-create-jlwpr\" (UID: \"9ca1c7f3-f23d-4e35-9183-4cbf943e567d\") " 
pod="openstack/placement-db-create-jlwpr" Sep 30 14:17:39 crc kubenswrapper[4676]: I0930 14:17:39.980647 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5tkn\" (UniqueName: \"kubernetes.io/projected/9ca1c7f3-f23d-4e35-9183-4cbf943e567d-kube-api-access-h5tkn\") pod \"placement-db-create-jlwpr\" (UID: \"9ca1c7f3-f23d-4e35-9183-4cbf943e567d\") " pod="openstack/placement-db-create-jlwpr" Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.139896 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jlwpr" Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.308694 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hw99z"] Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.598198 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jlwpr"] Sep 30 14:17:40 crc kubenswrapper[4676]: W0930 14:17:40.598694 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ca1c7f3_f23d_4e35_9183_4cbf943e567d.slice/crio-a2979356db95fe24148d92859cd66cda7a5b080ac1645ccd55e03e2841e1c2c3 WatchSource:0}: Error finding container a2979356db95fe24148d92859cd66cda7a5b080ac1645ccd55e03e2841e1c2c3: Status 404 returned error can't find the container with id a2979356db95fe24148d92859cd66cda7a5b080ac1645ccd55e03e2841e1c2c3 Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.911946 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jlwpr" event={"ID":"9ca1c7f3-f23d-4e35-9183-4cbf943e567d","Type":"ContainerStarted","Data":"04d4adbd34469149417bcd5d2ed90120c0c6da2aec29a1482c962af67dcaeaae"} Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.912296 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jlwpr" 
event={"ID":"9ca1c7f3-f23d-4e35-9183-4cbf943e567d","Type":"ContainerStarted","Data":"a2979356db95fe24148d92859cd66cda7a5b080ac1645ccd55e03e2841e1c2c3"} Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.914930 4676 generic.go:334] "Generic (PLEG): container finished" podID="1ac6c313-76fd-461b-9ca1-b93c7e1fb915" containerID="e51fa9717e27d3e2f72286bb5b0738d6d9067bb2754f56d1b5795876c3d36644" exitCode=0 Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.915017 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hw99z" event={"ID":"1ac6c313-76fd-461b-9ca1-b93c7e1fb915","Type":"ContainerDied","Data":"e51fa9717e27d3e2f72286bb5b0738d6d9067bb2754f56d1b5795876c3d36644"} Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.915055 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hw99z" event={"ID":"1ac6c313-76fd-461b-9ca1-b93c7e1fb915","Type":"ContainerStarted","Data":"f65d23ac036dbc8068f110a4f13211c9bdaf6da34f99471c667302bf0b89c0aa"} Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.916807 4676 generic.go:334] "Generic (PLEG): container finished" podID="fb668321-0ea7-4c30-9773-6c7f511959f4" containerID="c5e5548589305e5cfc9fd6b658354112d72a6dacb865ba1d7017a328e34acb22" exitCode=0 Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.916913 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbg62" event={"ID":"fb668321-0ea7-4c30-9773-6c7f511959f4","Type":"ContainerDied","Data":"c5e5548589305e5cfc9fd6b658354112d72a6dacb865ba1d7017a328e34acb22"} Sep 30 14:17:40 crc kubenswrapper[4676]: I0930 14:17:40.938109 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-jlwpr" podStartSLOduration=1.938091412 podStartE2EDuration="1.938091412s" podCreationTimestamp="2025-09-30 14:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 14:17:40.9302277 +0000 UTC m=+1164.913316189" watchObservedRunningTime="2025-09-30 14:17:40.938091412 +0000 UTC m=+1164.921179841" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.388921 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k4qfl" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.491812 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k62b5\" (UniqueName: \"kubernetes.io/projected/dd479fd7-ab79-40ee-a0f0-e4f2b192c80c-kube-api-access-k62b5\") pod \"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c\" (UID: \"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c\") " Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.497374 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-msxhv" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.497623 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd479fd7-ab79-40ee-a0f0-e4f2b192c80c-kube-api-access-k62b5" (OuterVolumeSpecName: "kube-api-access-k62b5") pod "dd479fd7-ab79-40ee-a0f0-e4f2b192c80c" (UID: "dd479fd7-ab79-40ee-a0f0-e4f2b192c80c"). InnerVolumeSpecName "kube-api-access-k62b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.506529 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-l4wcb" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.593659 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhfgv\" (UniqueName: \"kubernetes.io/projected/53f1c642-a974-44d6-8d01-457730d2a186-kube-api-access-fhfgv\") pod \"53f1c642-a974-44d6-8d01-457730d2a186\" (UID: \"53f1c642-a974-44d6-8d01-457730d2a186\") " Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.593808 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk2ln\" (UniqueName: \"kubernetes.io/projected/b9f4602f-67b5-451d-9944-b3125ac805b2-kube-api-access-zk2ln\") pod \"b9f4602f-67b5-451d-9944-b3125ac805b2\" (UID: \"b9f4602f-67b5-451d-9944-b3125ac805b2\") " Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.594163 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.594395 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k62b5\" (UniqueName: \"kubernetes.io/projected/dd479fd7-ab79-40ee-a0f0-e4f2b192c80c-kube-api-access-k62b5\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.596709 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f4602f-67b5-451d-9944-b3125ac805b2-kube-api-access-zk2ln" (OuterVolumeSpecName: "kube-api-access-zk2ln") pod "b9f4602f-67b5-451d-9944-b3125ac805b2" (UID: "b9f4602f-67b5-451d-9944-b3125ac805b2"). InnerVolumeSpecName "kube-api-access-zk2ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.597785 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f1c642-a974-44d6-8d01-457730d2a186-kube-api-access-fhfgv" (OuterVolumeSpecName: "kube-api-access-fhfgv") pod "53f1c642-a974-44d6-8d01-457730d2a186" (UID: "53f1c642-a974-44d6-8d01-457730d2a186"). InnerVolumeSpecName "kube-api-access-fhfgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.696014 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhfgv\" (UniqueName: \"kubernetes.io/projected/53f1c642-a974-44d6-8d01-457730d2a186-kube-api-access-fhfgv\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.696062 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk2ln\" (UniqueName: \"kubernetes.io/projected/b9f4602f-67b5-451d-9944-b3125ac805b2-kube-api-access-zk2ln\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.927935 4676 generic.go:334] "Generic (PLEG): container finished" podID="9ca1c7f3-f23d-4e35-9183-4cbf943e567d" containerID="04d4adbd34469149417bcd5d2ed90120c0c6da2aec29a1482c962af67dcaeaae" exitCode=0 Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.928049 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jlwpr" event={"ID":"9ca1c7f3-f23d-4e35-9183-4cbf943e567d","Type":"ContainerDied","Data":"04d4adbd34469149417bcd5d2ed90120c0c6da2aec29a1482c962af67dcaeaae"} Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.931652 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-k4qfl" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.931998 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k4qfl" event={"ID":"dd479fd7-ab79-40ee-a0f0-e4f2b192c80c","Type":"ContainerDied","Data":"c8e24f0b0fc3b0bd9ce8d3ce50250e55569333f7fd97b596053345d27532c9ab"} Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.932064 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8e24f0b0fc3b0bd9ce8d3ce50250e55569333f7fd97b596053345d27532c9ab" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.940402 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l4wcb" event={"ID":"b9f4602f-67b5-451d-9944-b3125ac805b2","Type":"ContainerDied","Data":"b4386e29195b3295e6401c72ff4aa46f4ccb5dd471277e0ee835d897f07e7881"} Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.940733 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4386e29195b3295e6401c72ff4aa46f4ccb5dd471277e0ee835d897f07e7881" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.940431 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l4wcb" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.944666 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-msxhv" event={"ID":"53f1c642-a974-44d6-8d01-457730d2a186","Type":"ContainerDied","Data":"30c15169eb717c822c11e8997fa8b43aa531affec2f6625ca7d1a810f064b9a7"} Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.944723 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30c15169eb717c822c11e8997fa8b43aa531affec2f6625ca7d1a810f064b9a7" Sep 30 14:17:41 crc kubenswrapper[4676]: I0930 14:17:41.944805 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-msxhv" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.351576 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.408684 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-combined-ca-bundle\") pod \"fb668321-0ea7-4c30-9773-6c7f511959f4\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.408756 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r6vk\" (UniqueName: \"kubernetes.io/projected/fb668321-0ea7-4c30-9773-6c7f511959f4-kube-api-access-2r6vk\") pod \"fb668321-0ea7-4c30-9773-6c7f511959f4\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.408791 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-ring-data-devices\") pod \"fb668321-0ea7-4c30-9773-6c7f511959f4\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.408848 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-scripts\") pod \"fb668321-0ea7-4c30-9773-6c7f511959f4\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.409016 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-dispersionconf\") pod \"fb668321-0ea7-4c30-9773-6c7f511959f4\" (UID: 
\"fb668321-0ea7-4c30-9773-6c7f511959f4\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.409052 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-swiftconf\") pod \"fb668321-0ea7-4c30-9773-6c7f511959f4\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.409082 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb668321-0ea7-4c30-9773-6c7f511959f4-etc-swift\") pod \"fb668321-0ea7-4c30-9773-6c7f511959f4\" (UID: \"fb668321-0ea7-4c30-9773-6c7f511959f4\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.410628 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb668321-0ea7-4c30-9773-6c7f511959f4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fb668321-0ea7-4c30-9773-6c7f511959f4" (UID: "fb668321-0ea7-4c30-9773-6c7f511959f4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.412791 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fb668321-0ea7-4c30-9773-6c7f511959f4" (UID: "fb668321-0ea7-4c30-9773-6c7f511959f4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.418704 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb668321-0ea7-4c30-9773-6c7f511959f4-kube-api-access-2r6vk" (OuterVolumeSpecName: "kube-api-access-2r6vk") pod "fb668321-0ea7-4c30-9773-6c7f511959f4" (UID: "fb668321-0ea7-4c30-9773-6c7f511959f4"). 
InnerVolumeSpecName "kube-api-access-2r6vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.425093 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fb668321-0ea7-4c30-9773-6c7f511959f4" (UID: "fb668321-0ea7-4c30-9773-6c7f511959f4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.434650 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-scripts" (OuterVolumeSpecName: "scripts") pod "fb668321-0ea7-4c30-9773-6c7f511959f4" (UID: "fb668321-0ea7-4c30-9773-6c7f511959f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.439649 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb668321-0ea7-4c30-9773-6c7f511959f4" (UID: "fb668321-0ea7-4c30-9773-6c7f511959f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.446132 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fb668321-0ea7-4c30-9773-6c7f511959f4" (UID: "fb668321-0ea7-4c30-9773-6c7f511959f4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.467603 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hw99z" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.510692 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t59bt\" (UniqueName: \"kubernetes.io/projected/1ac6c313-76fd-461b-9ca1-b93c7e1fb915-kube-api-access-t59bt\") pod \"1ac6c313-76fd-461b-9ca1-b93c7e1fb915\" (UID: \"1ac6c313-76fd-461b-9ca1-b93c7e1fb915\") " Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.512017 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.512044 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r6vk\" (UniqueName: \"kubernetes.io/projected/fb668321-0ea7-4c30-9773-6c7f511959f4-kube-api-access-2r6vk\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.512062 4676 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.512073 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb668321-0ea7-4c30-9773-6c7f511959f4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.512083 4676 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.512096 4676 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/fb668321-0ea7-4c30-9773-6c7f511959f4-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.512106 4676 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fb668321-0ea7-4c30-9773-6c7f511959f4-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.518175 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac6c313-76fd-461b-9ca1-b93c7e1fb915-kube-api-access-t59bt" (OuterVolumeSpecName: "kube-api-access-t59bt") pod "1ac6c313-76fd-461b-9ca1-b93c7e1fb915" (UID: "1ac6c313-76fd-461b-9ca1-b93c7e1fb915"). InnerVolumeSpecName "kube-api-access-t59bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.613466 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t59bt\" (UniqueName: \"kubernetes.io/projected/1ac6c313-76fd-461b-9ca1-b93c7e1fb915-kube-api-access-t59bt\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.957975 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbg62" event={"ID":"fb668321-0ea7-4c30-9773-6c7f511959f4","Type":"ContainerDied","Data":"cea21690f828c4aa63ccdad007a4163c0c8744eebdb828fa7cffce2374dde70f"} Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.958026 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea21690f828c4aa63ccdad007a4163c0c8744eebdb828fa7cffce2374dde70f" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.958031 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vbg62" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.959539 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hw99z" event={"ID":"1ac6c313-76fd-461b-9ca1-b93c7e1fb915","Type":"ContainerDied","Data":"f65d23ac036dbc8068f110a4f13211c9bdaf6da34f99471c667302bf0b89c0aa"} Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.959606 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f65d23ac036dbc8068f110a4f13211c9bdaf6da34f99471c667302bf0b89c0aa" Sep 30 14:17:42 crc kubenswrapper[4676]: I0930 14:17:42.959565 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hw99z" Sep 30 14:17:43 crc kubenswrapper[4676]: I0930 14:17:43.249776 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jlwpr" Sep 30 14:17:43 crc kubenswrapper[4676]: I0930 14:17:43.324028 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5tkn\" (UniqueName: \"kubernetes.io/projected/9ca1c7f3-f23d-4e35-9183-4cbf943e567d-kube-api-access-h5tkn\") pod \"9ca1c7f3-f23d-4e35-9183-4cbf943e567d\" (UID: \"9ca1c7f3-f23d-4e35-9183-4cbf943e567d\") " Sep 30 14:17:43 crc kubenswrapper[4676]: I0930 14:17:43.327540 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca1c7f3-f23d-4e35-9183-4cbf943e567d-kube-api-access-h5tkn" (OuterVolumeSpecName: "kube-api-access-h5tkn") pod "9ca1c7f3-f23d-4e35-9183-4cbf943e567d" (UID: "9ca1c7f3-f23d-4e35-9183-4cbf943e567d"). InnerVolumeSpecName "kube-api-access-h5tkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:43 crc kubenswrapper[4676]: I0930 14:17:43.425941 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5tkn\" (UniqueName: \"kubernetes.io/projected/9ca1c7f3-f23d-4e35-9183-4cbf943e567d-kube-api-access-h5tkn\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:43 crc kubenswrapper[4676]: I0930 14:17:43.968102 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jlwpr" event={"ID":"9ca1c7f3-f23d-4e35-9183-4cbf943e567d","Type":"ContainerDied","Data":"a2979356db95fe24148d92859cd66cda7a5b080ac1645ccd55e03e2841e1c2c3"} Sep 30 14:17:43 crc kubenswrapper[4676]: I0930 14:17:43.968133 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jlwpr" Sep 30 14:17:43 crc kubenswrapper[4676]: I0930 14:17:43.968156 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2979356db95fe24148d92859cd66cda7a5b080ac1645ccd55e03e2841e1c2c3" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.374635 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.378362 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7t792" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.600449 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9nlxb-config-qq75c"] Sep 30 14:17:45 crc kubenswrapper[4676]: E0930 14:17:45.602294 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb668321-0ea7-4c30-9773-6c7f511959f4" containerName="swift-ring-rebalance" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602321 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb668321-0ea7-4c30-9773-6c7f511959f4" containerName="swift-ring-rebalance" Sep 30 14:17:45 
crc kubenswrapper[4676]: E0930 14:17:45.602338 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd479fd7-ab79-40ee-a0f0-e4f2b192c80c" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602344 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd479fd7-ab79-40ee-a0f0-e4f2b192c80c" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: E0930 14:17:45.602356 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f4602f-67b5-451d-9944-b3125ac805b2" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602361 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f4602f-67b5-451d-9944-b3125ac805b2" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: E0930 14:17:45.602373 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac6c313-76fd-461b-9ca1-b93c7e1fb915" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602378 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac6c313-76fd-461b-9ca1-b93c7e1fb915" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: E0930 14:17:45.602400 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca1c7f3-f23d-4e35-9183-4cbf943e567d" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602406 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca1c7f3-f23d-4e35-9183-4cbf943e567d" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: E0930 14:17:45.602418 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f1c642-a974-44d6-8d01-457730d2a186" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602423 4676 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="53f1c642-a974-44d6-8d01-457730d2a186" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602582 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac6c313-76fd-461b-9ca1-b93c7e1fb915" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602602 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca1c7f3-f23d-4e35-9183-4cbf943e567d" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602614 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f1c642-a974-44d6-8d01-457730d2a186" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602627 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb668321-0ea7-4c30-9773-6c7f511959f4" containerName="swift-ring-rebalance" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602636 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9f4602f-67b5-451d-9944-b3125ac805b2" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.602644 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd479fd7-ab79-40ee-a0f0-e4f2b192c80c" containerName="mariadb-database-create" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.603207 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.607112 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.661131 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9nlxb-config-qq75c"] Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.661574 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run-ovn\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.661677 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-log-ovn\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.661736 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7hm\" (UniqueName: \"kubernetes.io/projected/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-kube-api-access-cb7hm\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.661770 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-additional-scripts\") pod 
\"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.661940 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-scripts\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.661973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.763523 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-log-ovn\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.763607 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7hm\" (UniqueName: \"kubernetes.io/projected/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-kube-api-access-cb7hm\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.763639 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-additional-scripts\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.763764 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-scripts\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.763846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.763916 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-log-ovn\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.763942 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run-ovn\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.764029 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run-ovn\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.764166 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.765130 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-additional-scripts\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.767281 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-scripts\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.785452 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7hm\" (UniqueName: \"kubernetes.io/projected/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-kube-api-access-cb7hm\") pod \"ovn-controller-9nlxb-config-qq75c\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:45 crc kubenswrapper[4676]: I0930 14:17:45.921102 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:46 crc kubenswrapper[4676]: I0930 14:17:46.376577 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9nlxb-config-qq75c"] Sep 30 14:17:46 crc kubenswrapper[4676]: I0930 14:17:46.995090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9nlxb-config-qq75c" event={"ID":"9c502fce-3c55-4e57-abb0-fb1cca9f11e4","Type":"ContainerStarted","Data":"bc1ab75b26feecc1cc385882130093498814d61294f51694dac523605102ac71"} Sep 30 14:17:46 crc kubenswrapper[4676]: I0930 14:17:46.995398 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9nlxb-config-qq75c" event={"ID":"9c502fce-3c55-4e57-abb0-fb1cca9f11e4","Type":"ContainerStarted","Data":"cff7bc51dd1c808581c5f01b781f97f9e9faba2cfd07a73b854ff791e3ab0ee1"} Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.013616 4676 generic.go:334] "Generic (PLEG): container finished" podID="9c502fce-3c55-4e57-abb0-fb1cca9f11e4" containerID="bc1ab75b26feecc1cc385882130093498814d61294f51694dac523605102ac71" exitCode=0 Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.013995 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9nlxb-config-qq75c" event={"ID":"9c502fce-3c55-4e57-abb0-fb1cca9f11e4","Type":"ContainerDied","Data":"bc1ab75b26feecc1cc385882130093498814d61294f51694dac523605102ac71"} Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.375924 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.421955 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7hm\" (UniqueName: \"kubernetes.io/projected/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-kube-api-access-cb7hm\") pod \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422011 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-scripts\") pod \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422037 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-additional-scripts\") pod \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422068 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run-ovn\") pod \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422286 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-log-ovn\") pod \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422315 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run\") pod \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\" (UID: \"9c502fce-3c55-4e57-abb0-fb1cca9f11e4\") " Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422845 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run" (OuterVolumeSpecName: "var-run") pod "9c502fce-3c55-4e57-abb0-fb1cca9f11e4" (UID: "9c502fce-3c55-4e57-abb0-fb1cca9f11e4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422989 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9c502fce-3c55-4e57-abb0-fb1cca9f11e4" (UID: "9c502fce-3c55-4e57-abb0-fb1cca9f11e4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.422985 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9c502fce-3c55-4e57-abb0-fb1cca9f11e4" (UID: "9c502fce-3c55-4e57-abb0-fb1cca9f11e4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.423512 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9c502fce-3c55-4e57-abb0-fb1cca9f11e4" (UID: "9c502fce-3c55-4e57-abb0-fb1cca9f11e4"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.423928 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-scripts" (OuterVolumeSpecName: "scripts") pod "9c502fce-3c55-4e57-abb0-fb1cca9f11e4" (UID: "9c502fce-3c55-4e57-abb0-fb1cca9f11e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.429295 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-kube-api-access-cb7hm" (OuterVolumeSpecName: "kube-api-access-cb7hm") pod "9c502fce-3c55-4e57-abb0-fb1cca9f11e4" (UID: "9c502fce-3c55-4e57-abb0-fb1cca9f11e4"). InnerVolumeSpecName "kube-api-access-cb7hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.525143 4676 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.525181 4676 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.525192 4676 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.525208 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7hm\" (UniqueName: \"kubernetes.io/projected/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-kube-api-access-cb7hm\") on node \"crc\" DevicePath \"\"" 
Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.525218 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.525226 4676 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9c502fce-3c55-4e57-abb0-fb1cca9f11e4-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.557085 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5ebc-account-create-7k42h"] Sep 30 14:17:48 crc kubenswrapper[4676]: E0930 14:17:48.557564 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c502fce-3c55-4e57-abb0-fb1cca9f11e4" containerName="ovn-config" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.557586 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c502fce-3c55-4e57-abb0-fb1cca9f11e4" containerName="ovn-config" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.557796 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c502fce-3c55-4e57-abb0-fb1cca9f11e4" containerName="ovn-config" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.558475 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5ebc-account-create-7k42h" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.560134 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.568700 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5ebc-account-create-7k42h"] Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.626621 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlffx\" (UniqueName: \"kubernetes.io/projected/c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7-kube-api-access-vlffx\") pod \"barbican-5ebc-account-create-7k42h\" (UID: \"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7\") " pod="openstack/barbican-5ebc-account-create-7k42h" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.658433 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6aea-account-create-fd92w"] Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.660119 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6aea-account-create-fd92w" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.665969 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.672253 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6aea-account-create-fd92w"] Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.728849 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlffx\" (UniqueName: \"kubernetes.io/projected/c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7-kube-api-access-vlffx\") pod \"barbican-5ebc-account-create-7k42h\" (UID: \"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7\") " pod="openstack/barbican-5ebc-account-create-7k42h" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.728939 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82rz9\" (UniqueName: \"kubernetes.io/projected/d6a97d90-e028-46e3-afdf-d872d446f162-kube-api-access-82rz9\") pod \"cinder-6aea-account-create-fd92w\" (UID: \"d6a97d90-e028-46e3-afdf-d872d446f162\") " pod="openstack/cinder-6aea-account-create-fd92w" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.756821 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlffx\" (UniqueName: \"kubernetes.io/projected/c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7-kube-api-access-vlffx\") pod \"barbican-5ebc-account-create-7k42h\" (UID: \"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7\") " pod="openstack/barbican-5ebc-account-create-7k42h" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.830768 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82rz9\" (UniqueName: \"kubernetes.io/projected/d6a97d90-e028-46e3-afdf-d872d446f162-kube-api-access-82rz9\") pod \"cinder-6aea-account-create-fd92w\" (UID: 
\"d6a97d90-e028-46e3-afdf-d872d446f162\") " pod="openstack/cinder-6aea-account-create-fd92w" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.850526 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-180c-account-create-zkbbd"] Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.851665 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-180c-account-create-zkbbd" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.860652 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.865265 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82rz9\" (UniqueName: \"kubernetes.io/projected/d6a97d90-e028-46e3-afdf-d872d446f162-kube-api-access-82rz9\") pod \"cinder-6aea-account-create-fd92w\" (UID: \"d6a97d90-e028-46e3-afdf-d872d446f162\") " pod="openstack/cinder-6aea-account-create-fd92w" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.865457 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-180c-account-create-zkbbd"] Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.884815 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ebc-account-create-7k42h" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.933199 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvt5\" (UniqueName: \"kubernetes.io/projected/076c2b0a-4ad4-44b9-bec1-def5ed805975-kube-api-access-dpvt5\") pod \"neutron-180c-account-create-zkbbd\" (UID: \"076c2b0a-4ad4-44b9-bec1-def5ed805975\") " pod="openstack/neutron-180c-account-create-zkbbd" Sep 30 14:17:48 crc kubenswrapper[4676]: I0930 14:17:48.976739 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6aea-account-create-fd92w" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.027755 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9nlxb-config-qq75c" event={"ID":"9c502fce-3c55-4e57-abb0-fb1cca9f11e4","Type":"ContainerDied","Data":"cff7bc51dd1c808581c5f01b781f97f9e9faba2cfd07a73b854ff791e3ab0ee1"} Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.027791 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff7bc51dd1c808581c5f01b781f97f9e9faba2cfd07a73b854ff791e3ab0ee1" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.027863 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9nlxb-config-qq75c" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.035243 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvt5\" (UniqueName: \"kubernetes.io/projected/076c2b0a-4ad4-44b9-bec1-def5ed805975-kube-api-access-dpvt5\") pod \"neutron-180c-account-create-zkbbd\" (UID: \"076c2b0a-4ad4-44b9-bec1-def5ed805975\") " pod="openstack/neutron-180c-account-create-zkbbd" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.055271 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvt5\" (UniqueName: \"kubernetes.io/projected/076c2b0a-4ad4-44b9-bec1-def5ed805975-kube-api-access-dpvt5\") pod \"neutron-180c-account-create-zkbbd\" (UID: \"076c2b0a-4ad4-44b9-bec1-def5ed805975\") " pod="openstack/neutron-180c-account-create-zkbbd" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.203610 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-180c-account-create-zkbbd" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.318826 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5ebc-account-create-7k42h"] Sep 30 14:17:49 crc kubenswrapper[4676]: W0930 14:17:49.326976 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc06b4228_b99e_40ab_bbea_8bfe2bb1ddd7.slice/crio-d5daadb054dfc3c7f0ee186a919e95b88d5d773284a431cedd543110989534fe WatchSource:0}: Error finding container d5daadb054dfc3c7f0ee186a919e95b88d5d773284a431cedd543110989534fe: Status 404 returned error can't find the container with id d5daadb054dfc3c7f0ee186a919e95b88d5d773284a431cedd543110989534fe Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.484590 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6aea-account-create-fd92w"] Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.516956 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9nlxb-config-qq75c"] Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.523695 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9nlxb-config-qq75c"] Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.639907 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-180c-account-create-zkbbd"] Sep 30 14:17:49 crc kubenswrapper[4676]: W0930 14:17:49.647104 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod076c2b0a_4ad4_44b9_bec1_def5ed805975.slice/crio-fa2a71f17d395cbfad7263f03490d0b365f996d705b32912bd61ca88bc5d7b56 WatchSource:0}: Error finding container fa2a71f17d395cbfad7263f03490d0b365f996d705b32912bd61ca88bc5d7b56: Status 404 returned error can't find the container with id 
fa2a71f17d395cbfad7263f03490d0b365f996d705b32912bd61ca88bc5d7b56 Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.658988 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1aaa-account-create-gxldf"] Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.660231 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1aaa-account-create-gxldf" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.662450 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.671498 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1aaa-account-create-gxldf"] Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.745741 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mrtc\" (UniqueName: \"kubernetes.io/projected/f56c6587-9965-4011-848b-c822e4572d6d-kube-api-access-7mrtc\") pod \"keystone-1aaa-account-create-gxldf\" (UID: \"f56c6587-9965-4011-848b-c822e4572d6d\") " pod="openstack/keystone-1aaa-account-create-gxldf" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.848101 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mrtc\" (UniqueName: \"kubernetes.io/projected/f56c6587-9965-4011-848b-c822e4572d6d-kube-api-access-7mrtc\") pod \"keystone-1aaa-account-create-gxldf\" (UID: \"f56c6587-9965-4011-848b-c822e4572d6d\") " pod="openstack/keystone-1aaa-account-create-gxldf" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.868871 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mrtc\" (UniqueName: \"kubernetes.io/projected/f56c6587-9965-4011-848b-c822e4572d6d-kube-api-access-7mrtc\") pod \"keystone-1aaa-account-create-gxldf\" (UID: \"f56c6587-9965-4011-848b-c822e4572d6d\") " 
pod="openstack/keystone-1aaa-account-create-gxldf" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.954184 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bc91-account-create-zznlq"] Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.955524 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bc91-account-create-zznlq" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.958362 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.962837 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc91-account-create-zznlq"] Sep 30 14:17:49 crc kubenswrapper[4676]: I0930 14:17:49.989759 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1aaa-account-create-gxldf" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.011692 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.047488 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ebc-account-create-7k42h" event={"ID":"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7","Type":"ContainerStarted","Data":"3b37d0fbff8312be575df3c088d4b2f86e874759e4a10839aa7bcab348b2f8ba"} Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.047547 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ebc-account-create-7k42h" event={"ID":"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7","Type":"ContainerStarted","Data":"d5daadb054dfc3c7f0ee186a919e95b88d5d773284a431cedd543110989534fe"} Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.051384 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-180c-account-create-zkbbd" 
event={"ID":"076c2b0a-4ad4-44b9-bec1-def5ed805975","Type":"ContainerStarted","Data":"f84cc29caea02667f85586eca8ef07d5f55a0e948f5838aac06c86ad4009c774"} Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.051433 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-180c-account-create-zkbbd" event={"ID":"076c2b0a-4ad4-44b9-bec1-def5ed805975","Type":"ContainerStarted","Data":"fa2a71f17d395cbfad7263f03490d0b365f996d705b32912bd61ca88bc5d7b56"} Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.052013 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9x2n\" (UniqueName: \"kubernetes.io/projected/c77bd056-9c07-43d9-bd25-02177d6b53cc-kube-api-access-b9x2n\") pod \"placement-bc91-account-create-zznlq\" (UID: \"c77bd056-9c07-43d9-bd25-02177d6b53cc\") " pod="openstack/placement-bc91-account-create-zznlq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.054645 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6aea-account-create-fd92w" event={"ID":"d6a97d90-e028-46e3-afdf-d872d446f162","Type":"ContainerStarted","Data":"566181ad0387e74b65009333ef7a68a4306d2ae6824e78d7717d1153368e4148"} Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.054682 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6aea-account-create-fd92w" event={"ID":"d6a97d90-e028-46e3-afdf-d872d446f162","Type":"ContainerStarted","Data":"ecbce8995e0646774d24adb9345d3b20d4ececb63b55e7c132d8c4ff0ba4959b"} Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.079123 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-5ebc-account-create-7k42h" podStartSLOduration=2.079097448 podStartE2EDuration="2.079097448s" podCreationTimestamp="2025-09-30 14:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
14:17:50.063109859 +0000 UTC m=+1174.046198288" watchObservedRunningTime="2025-09-30 14:17:50.079097448 +0000 UTC m=+1174.062185887" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.105788 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-6aea-account-create-fd92w" podStartSLOduration=2.105767844 podStartE2EDuration="2.105767844s" podCreationTimestamp="2025-09-30 14:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:17:50.086192988 +0000 UTC m=+1174.069281437" watchObservedRunningTime="2025-09-30 14:17:50.105767844 +0000 UTC m=+1174.088856283" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.149948 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-180c-account-create-zkbbd" podStartSLOduration=2.14991342 podStartE2EDuration="2.14991342s" podCreationTimestamp="2025-09-30 14:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:17:50.1152685 +0000 UTC m=+1174.098356939" watchObservedRunningTime="2025-09-30 14:17:50.14991342 +0000 UTC m=+1174.133001849" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.153372 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9x2n\" (UniqueName: \"kubernetes.io/projected/c77bd056-9c07-43d9-bd25-02177d6b53cc-kube-api-access-b9x2n\") pod \"placement-bc91-account-create-zznlq\" (UID: \"c77bd056-9c07-43d9-bd25-02177d6b53cc\") " pod="openstack/placement-bc91-account-create-zznlq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.178928 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9x2n\" (UniqueName: \"kubernetes.io/projected/c77bd056-9c07-43d9-bd25-02177d6b53cc-kube-api-access-b9x2n\") pod 
\"placement-bc91-account-create-zznlq\" (UID: \"c77bd056-9c07-43d9-bd25-02177d6b53cc\") " pod="openstack/placement-bc91-account-create-zznlq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.257817 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-38b7-account-create-msmmq"] Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.259332 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-38b7-account-create-msmmq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.262594 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.280135 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bc91-account-create-zznlq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.285427 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-38b7-account-create-msmmq"] Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.357154 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crpb2\" (UniqueName: \"kubernetes.io/projected/62ea603d-33b6-4dfe-ad26-ccf513a14ae5-kube-api-access-crpb2\") pod \"glance-38b7-account-create-msmmq\" (UID: \"62ea603d-33b6-4dfe-ad26-ccf513a14ae5\") " pod="openstack/glance-38b7-account-create-msmmq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.365137 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9nlxb" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.459591 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crpb2\" (UniqueName: \"kubernetes.io/projected/62ea603d-33b6-4dfe-ad26-ccf513a14ae5-kube-api-access-crpb2\") pod \"glance-38b7-account-create-msmmq\" (UID: \"62ea603d-33b6-4dfe-ad26-ccf513a14ae5\") " 
pod="openstack/glance-38b7-account-create-msmmq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.470921 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1aaa-account-create-gxldf"] Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.487452 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crpb2\" (UniqueName: \"kubernetes.io/projected/62ea603d-33b6-4dfe-ad26-ccf513a14ae5-kube-api-access-crpb2\") pod \"glance-38b7-account-create-msmmq\" (UID: \"62ea603d-33b6-4dfe-ad26-ccf513a14ae5\") " pod="openstack/glance-38b7-account-create-msmmq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.760594 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-38b7-account-create-msmmq" Sep 30 14:17:50 crc kubenswrapper[4676]: I0930 14:17:50.768544 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc91-account-create-zznlq"] Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.066434 4676 generic.go:334] "Generic (PLEG): container finished" podID="c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7" containerID="3b37d0fbff8312be575df3c088d4b2f86e874759e4a10839aa7bcab348b2f8ba" exitCode=0 Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.066553 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ebc-account-create-7k42h" event={"ID":"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7","Type":"ContainerDied","Data":"3b37d0fbff8312be575df3c088d4b2f86e874759e4a10839aa7bcab348b2f8ba"} Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.069584 4676 generic.go:334] "Generic (PLEG): container finished" podID="076c2b0a-4ad4-44b9-bec1-def5ed805975" containerID="f84cc29caea02667f85586eca8ef07d5f55a0e948f5838aac06c86ad4009c774" exitCode=0 Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.071096 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-180c-account-create-zkbbd" 
event={"ID":"076c2b0a-4ad4-44b9-bec1-def5ed805975","Type":"ContainerDied","Data":"f84cc29caea02667f85586eca8ef07d5f55a0e948f5838aac06c86ad4009c774"} Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.076644 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc91-account-create-zznlq" event={"ID":"c77bd056-9c07-43d9-bd25-02177d6b53cc","Type":"ContainerStarted","Data":"4022940783cd607b2d2bc22640a6f04a0ac969684e89b732dd4720875430e4bb"} Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.078688 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1aaa-account-create-gxldf" event={"ID":"f56c6587-9965-4011-848b-c822e4572d6d","Type":"ContainerStarted","Data":"e9b7ce2baaeb59e75ede60b3cbd9b035f3193b688a8c1fe680c3fd5314355ad3"} Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.080626 4676 generic.go:334] "Generic (PLEG): container finished" podID="d6a97d90-e028-46e3-afdf-d872d446f162" containerID="566181ad0387e74b65009333ef7a68a4306d2ae6824e78d7717d1153368e4148" exitCode=0 Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.080661 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6aea-account-create-fd92w" event={"ID":"d6a97d90-e028-46e3-afdf-d872d446f162","Type":"ContainerDied","Data":"566181ad0387e74b65009333ef7a68a4306d2ae6824e78d7717d1153368e4148"} Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.196058 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-38b7-account-create-msmmq"] Sep 30 14:17:51 crc kubenswrapper[4676]: W0930 14:17:51.202071 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62ea603d_33b6_4dfe_ad26_ccf513a14ae5.slice/crio-bcbca0f9fcc041abf78464c0ab33c4e55f015c0bb529eb63324966f9d18c59b8 WatchSource:0}: Error finding container bcbca0f9fcc041abf78464c0ab33c4e55f015c0bb529eb63324966f9d18c59b8: Status 404 returned error can't find the container 
with id bcbca0f9fcc041abf78464c0ab33c4e55f015c0bb529eb63324966f9d18c59b8 Sep 30 14:17:51 crc kubenswrapper[4676]: I0930 14:17:51.443706 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c502fce-3c55-4e57-abb0-fb1cca9f11e4" path="/var/lib/kubelet/pods/9c502fce-3c55-4e57-abb0-fb1cca9f11e4/volumes" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.090078 4676 generic.go:334] "Generic (PLEG): container finished" podID="62ea603d-33b6-4dfe-ad26-ccf513a14ae5" containerID="e72daba28095d339cd316212f8ade7d1e406e315c44893a2a47b1b080542f5ce" exitCode=0 Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.090180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-38b7-account-create-msmmq" event={"ID":"62ea603d-33b6-4dfe-ad26-ccf513a14ae5","Type":"ContainerDied","Data":"e72daba28095d339cd316212f8ade7d1e406e315c44893a2a47b1b080542f5ce"} Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.091690 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-38b7-account-create-msmmq" event={"ID":"62ea603d-33b6-4dfe-ad26-ccf513a14ae5","Type":"ContainerStarted","Data":"bcbca0f9fcc041abf78464c0ab33c4e55f015c0bb529eb63324966f9d18c59b8"} Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.094005 4676 generic.go:334] "Generic (PLEG): container finished" podID="c77bd056-9c07-43d9-bd25-02177d6b53cc" containerID="e7909666c56ef6be2ee71a7394d383d2490ffe1053beac7d57a47a11963c70c9" exitCode=0 Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.094147 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc91-account-create-zznlq" event={"ID":"c77bd056-9c07-43d9-bd25-02177d6b53cc","Type":"ContainerDied","Data":"e7909666c56ef6be2ee71a7394d383d2490ffe1053beac7d57a47a11963c70c9"} Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.096257 4676 generic.go:334] "Generic (PLEG): container finished" podID="f56c6587-9965-4011-848b-c822e4572d6d" 
containerID="27156369ce0ce4742a4028041e67fd0beb70a95832d943bbf3147e6ccdadd5b7" exitCode=0 Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.096480 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1aaa-account-create-gxldf" event={"ID":"f56c6587-9965-4011-848b-c822e4572d6d","Type":"ContainerDied","Data":"27156369ce0ce4742a4028041e67fd0beb70a95832d943bbf3147e6ccdadd5b7"} Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.598800 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ebc-account-create-7k42h" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.699813 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlffx\" (UniqueName: \"kubernetes.io/projected/c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7-kube-api-access-vlffx\") pod \"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7\" (UID: \"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7\") " Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.707172 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7-kube-api-access-vlffx" (OuterVolumeSpecName: "kube-api-access-vlffx") pod "c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7" (UID: "c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7"). InnerVolumeSpecName "kube-api-access-vlffx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.709809 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6aea-account-create-fd92w" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.761657 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-180c-account-create-zkbbd" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.801311 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82rz9\" (UniqueName: \"kubernetes.io/projected/d6a97d90-e028-46e3-afdf-d872d446f162-kube-api-access-82rz9\") pod \"d6a97d90-e028-46e3-afdf-d872d446f162\" (UID: \"d6a97d90-e028-46e3-afdf-d872d446f162\") " Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.802422 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlffx\" (UniqueName: \"kubernetes.io/projected/c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7-kube-api-access-vlffx\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.824824 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a97d90-e028-46e3-afdf-d872d446f162-kube-api-access-82rz9" (OuterVolumeSpecName: "kube-api-access-82rz9") pod "d6a97d90-e028-46e3-afdf-d872d446f162" (UID: "d6a97d90-e028-46e3-afdf-d872d446f162"). InnerVolumeSpecName "kube-api-access-82rz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.903374 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpvt5\" (UniqueName: \"kubernetes.io/projected/076c2b0a-4ad4-44b9-bec1-def5ed805975-kube-api-access-dpvt5\") pod \"076c2b0a-4ad4-44b9-bec1-def5ed805975\" (UID: \"076c2b0a-4ad4-44b9-bec1-def5ed805975\") " Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.903951 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82rz9\" (UniqueName: \"kubernetes.io/projected/d6a97d90-e028-46e3-afdf-d872d446f162-kube-api-access-82rz9\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:52 crc kubenswrapper[4676]: I0930 14:17:52.906722 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076c2b0a-4ad4-44b9-bec1-def5ed805975-kube-api-access-dpvt5" (OuterVolumeSpecName: "kube-api-access-dpvt5") pod "076c2b0a-4ad4-44b9-bec1-def5ed805975" (UID: "076c2b0a-4ad4-44b9-bec1-def5ed805975"). InnerVolumeSpecName "kube-api-access-dpvt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.006460 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpvt5\" (UniqueName: \"kubernetes.io/projected/076c2b0a-4ad4-44b9-bec1-def5ed805975-kube-api-access-dpvt5\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.114896 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-180c-account-create-zkbbd" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.114898 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-180c-account-create-zkbbd" event={"ID":"076c2b0a-4ad4-44b9-bec1-def5ed805975","Type":"ContainerDied","Data":"fa2a71f17d395cbfad7263f03490d0b365f996d705b32912bd61ca88bc5d7b56"} Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.114954 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa2a71f17d395cbfad7263f03490d0b365f996d705b32912bd61ca88bc5d7b56" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.117046 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6aea-account-create-fd92w" event={"ID":"d6a97d90-e028-46e3-afdf-d872d446f162","Type":"ContainerDied","Data":"ecbce8995e0646774d24adb9345d3b20d4ececb63b55e7c132d8c4ff0ba4959b"} Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.117092 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecbce8995e0646774d24adb9345d3b20d4ececb63b55e7c132d8c4ff0ba4959b" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.117124 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6aea-account-create-fd92w" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.122906 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ebc-account-create-7k42h" event={"ID":"c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7","Type":"ContainerDied","Data":"d5daadb054dfc3c7f0ee186a919e95b88d5d773284a431cedd543110989534fe"} Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.122994 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5daadb054dfc3c7f0ee186a919e95b88d5d773284a431cedd543110989534fe" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.123060 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5ebc-account-create-7k42h" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.449399 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bc91-account-create-zznlq" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.481094 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-38b7-account-create-msmmq" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.494203 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1aaa-account-create-gxldf" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.514650 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9x2n\" (UniqueName: \"kubernetes.io/projected/c77bd056-9c07-43d9-bd25-02177d6b53cc-kube-api-access-b9x2n\") pod \"c77bd056-9c07-43d9-bd25-02177d6b53cc\" (UID: \"c77bd056-9c07-43d9-bd25-02177d6b53cc\") " Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.518677 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77bd056-9c07-43d9-bd25-02177d6b53cc-kube-api-access-b9x2n" (OuterVolumeSpecName: "kube-api-access-b9x2n") pod "c77bd056-9c07-43d9-bd25-02177d6b53cc" (UID: "c77bd056-9c07-43d9-bd25-02177d6b53cc"). InnerVolumeSpecName "kube-api-access-b9x2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.616389 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mrtc\" (UniqueName: \"kubernetes.io/projected/f56c6587-9965-4011-848b-c822e4572d6d-kube-api-access-7mrtc\") pod \"f56c6587-9965-4011-848b-c822e4572d6d\" (UID: \"f56c6587-9965-4011-848b-c822e4572d6d\") " Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.616573 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crpb2\" (UniqueName: \"kubernetes.io/projected/62ea603d-33b6-4dfe-ad26-ccf513a14ae5-kube-api-access-crpb2\") pod \"62ea603d-33b6-4dfe-ad26-ccf513a14ae5\" (UID: \"62ea603d-33b6-4dfe-ad26-ccf513a14ae5\") " Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.616991 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9x2n\" (UniqueName: \"kubernetes.io/projected/c77bd056-9c07-43d9-bd25-02177d6b53cc-kube-api-access-b9x2n\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.619589 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ea603d-33b6-4dfe-ad26-ccf513a14ae5-kube-api-access-crpb2" (OuterVolumeSpecName: "kube-api-access-crpb2") pod "62ea603d-33b6-4dfe-ad26-ccf513a14ae5" (UID: "62ea603d-33b6-4dfe-ad26-ccf513a14ae5"). InnerVolumeSpecName "kube-api-access-crpb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.620114 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f56c6587-9965-4011-848b-c822e4572d6d-kube-api-access-7mrtc" (OuterVolumeSpecName: "kube-api-access-7mrtc") pod "f56c6587-9965-4011-848b-c822e4572d6d" (UID: "f56c6587-9965-4011-848b-c822e4572d6d"). InnerVolumeSpecName "kube-api-access-7mrtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.718580 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crpb2\" (UniqueName: \"kubernetes.io/projected/62ea603d-33b6-4dfe-ad26-ccf513a14ae5-kube-api-access-crpb2\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:53 crc kubenswrapper[4676]: I0930 14:17:53.718634 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mrtc\" (UniqueName: \"kubernetes.io/projected/f56c6587-9965-4011-848b-c822e4572d6d-kube-api-access-7mrtc\") on node \"crc\" DevicePath \"\"" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.132303 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1aaa-account-create-gxldf" event={"ID":"f56c6587-9965-4011-848b-c822e4572d6d","Type":"ContainerDied","Data":"e9b7ce2baaeb59e75ede60b3cbd9b035f3193b688a8c1fe680c3fd5314355ad3"} Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.133834 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9b7ce2baaeb59e75ede60b3cbd9b035f3193b688a8c1fe680c3fd5314355ad3" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.132333 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1aaa-account-create-gxldf" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.134946 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-38b7-account-create-msmmq" event={"ID":"62ea603d-33b6-4dfe-ad26-ccf513a14ae5","Type":"ContainerDied","Data":"bcbca0f9fcc041abf78464c0ab33c4e55f015c0bb529eb63324966f9d18c59b8"} Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.134978 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcbca0f9fcc041abf78464c0ab33c4e55f015c0bb529eb63324966f9d18c59b8" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.135021 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-38b7-account-create-msmmq" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.137512 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc91-account-create-zznlq" event={"ID":"c77bd056-9c07-43d9-bd25-02177d6b53cc","Type":"ContainerDied","Data":"4022940783cd607b2d2bc22640a6f04a0ac969684e89b732dd4720875430e4bb"} Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.137572 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4022940783cd607b2d2bc22640a6f04a0ac969684e89b732dd4720875430e4bb" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.137647 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bc91-account-create-zznlq" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.937618 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:54 crc kubenswrapper[4676]: I0930 14:17:54.946279 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c0230d2-8bbc-4ad0-8f3d-062d2d940013-etc-swift\") pod \"swift-storage-0\" (UID: \"6c0230d2-8bbc-4ad0-8f3d-062d2d940013\") " pod="openstack/swift-storage-0" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.030952 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.225620 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2m994"] Sep 30 14:17:55 crc kubenswrapper[4676]: E0930 14:17:55.226441 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076c2b0a-4ad4-44b9-bec1-def5ed805975" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226458 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="076c2b0a-4ad4-44b9-bec1-def5ed805975" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: E0930 14:17:55.226469 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77bd056-9c07-43d9-bd25-02177d6b53cc" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226479 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77bd056-9c07-43d9-bd25-02177d6b53cc" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: E0930 14:17:55.226495 4676 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a97d90-e028-46e3-afdf-d872d446f162" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226503 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a97d90-e028-46e3-afdf-d872d446f162" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: E0930 14:17:55.226517 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ea603d-33b6-4dfe-ad26-ccf513a14ae5" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226524 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ea603d-33b6-4dfe-ad26-ccf513a14ae5" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: E0930 14:17:55.226536 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226543 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: E0930 14:17:55.226552 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56c6587-9965-4011-848b-c822e4572d6d" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226559 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56c6587-9965-4011-848b-c822e4572d6d" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226795 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226813 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a97d90-e028-46e3-afdf-d872d446f162" containerName="mariadb-account-create" Sep 
30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226823 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77bd056-9c07-43d9-bd25-02177d6b53cc" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226836 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56c6587-9965-4011-848b-c822e4572d6d" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226846 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ea603d-33b6-4dfe-ad26-ccf513a14ae5" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.226856 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="076c2b0a-4ad4-44b9-bec1-def5ed805975" containerName="mariadb-account-create" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.227610 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.232385 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.232608 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7jtlk" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.232796 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.236266 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.244308 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2m994"] Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.346346 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-combined-ca-bundle\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.346445 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-config-data\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.346483 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzjq\" (UniqueName: \"kubernetes.io/projected/b0f731ab-125f-48d4-838e-a38a5e78c6fb-kube-api-access-nmzjq\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.447860 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-config-data\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.447971 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzjq\" (UniqueName: \"kubernetes.io/projected/b0f731ab-125f-48d4-838e-a38a5e78c6fb-kube-api-access-nmzjq\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.448048 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-combined-ca-bundle\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.455995 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-config-data\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.468789 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-combined-ca-bundle\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.480531 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzjq\" (UniqueName: \"kubernetes.io/projected/b0f731ab-125f-48d4-838e-a38a5e78c6fb-kube-api-access-nmzjq\") pod \"keystone-db-sync-2m994\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.508716 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-fptlq"] Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.511154 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.514584 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.525765 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fptlq"] Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.526017 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-59zsj" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.559676 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2m994" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.654117 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-combined-ca-bundle\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.654212 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-db-sync-config-data\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.654354 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-config-data\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.654481 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whng\" (UniqueName: \"kubernetes.io/projected/7724b184-865f-4ced-bdf7-867184cf3647-kube-api-access-9whng\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.659744 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.756451 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whng\" (UniqueName: \"kubernetes.io/projected/7724b184-865f-4ced-bdf7-867184cf3647-kube-api-access-9whng\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.756558 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-combined-ca-bundle\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.756650 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-db-sync-config-data\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.756704 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-config-data\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" 
Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.764340 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-combined-ca-bundle\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.764619 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-config-data\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.773321 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-db-sync-config-data\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.776458 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whng\" (UniqueName: \"kubernetes.io/projected/7724b184-865f-4ced-bdf7-867184cf3647-kube-api-access-9whng\") pod \"glance-db-sync-fptlq\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:55 crc kubenswrapper[4676]: I0930 14:17:55.866288 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fptlq" Sep 30 14:17:56 crc kubenswrapper[4676]: I0930 14:17:56.027619 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2m994"] Sep 30 14:17:57 crc kubenswrapper[4676]: I0930 14:17:56.182058 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"4326f8f56cbba9551ddd65855cb7742457f5ba8a3ace1f1ff2364554b4ebb89f"} Sep 30 14:17:57 crc kubenswrapper[4676]: I0930 14:17:56.183624 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2m994" event={"ID":"b0f731ab-125f-48d4-838e-a38a5e78c6fb","Type":"ContainerStarted","Data":"1b447e90a57fbe7d7f47ef104ed606e59e5cc1f8bfd7f93dbd8349ff00af37c8"} Sep 30 14:17:57 crc kubenswrapper[4676]: I0930 14:17:56.437342 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fptlq"] Sep 30 14:17:57 crc kubenswrapper[4676]: I0930 14:17:57.199642 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptlq" event={"ID":"7724b184-865f-4ced-bdf7-867184cf3647","Type":"ContainerStarted","Data":"d892166cd8fb315cc629792785ae20766ebb2350fd835e3907ed3005eb1b89f1"} Sep 30 14:17:58 crc kubenswrapper[4676]: I0930 14:17:58.211311 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"497596b711005584c56adf5503171c1930ded1623b4638d8d87480909c994440"} Sep 30 14:17:58 crc kubenswrapper[4676]: I0930 14:17:58.211674 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"b06a4f5b79ef334d95abc478952b6863e25c2b6e25732f021dd0ba877b93ef70"} Sep 30 14:18:06 crc kubenswrapper[4676]: I0930 14:18:05.395336 4676 scope.go:117] "RemoveContainer" 
containerID="2e5fc7551bda9eaa7665ce408f0e3324a79149228a6bcf970a014acd4921777d" Sep 30 14:18:14 crc kubenswrapper[4676]: E0930 14:18:14.370666 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Sep 30 14:18:14 crc kubenswrapper[4676]: E0930 14:18:14.371440 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9whng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},P
rivileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-fptlq_openstack(7724b184-865f-4ced-bdf7-867184cf3647): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:18:14 crc kubenswrapper[4676]: E0930 14:18:14.373478 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-fptlq" podUID="7724b184-865f-4ced-bdf7-867184cf3647" Sep 30 14:18:14 crc kubenswrapper[4676]: I0930 14:18:14.532653 4676 scope.go:117] "RemoveContainer" containerID="55657124539b5d89602b934b55eaca5975b95e70751cdb502b1dfc1f01618d92" Sep 30 14:18:14 crc kubenswrapper[4676]: I0930 14:18:14.663913 4676 scope.go:117] "RemoveContainer" containerID="6203286c6e0c355901767abe7ef5a1201e7df5880ea892f0d826852200594e54" Sep 30 14:18:15 crc kubenswrapper[4676]: I0930 14:18:15.372713 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"1ebabd35e43a13aa180d166c165f0727a1592fe4a4e971c365da783535708eeb"} Sep 30 14:18:15 crc kubenswrapper[4676]: I0930 14:18:15.373259 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"24e697260bb5774daf8ad244659b8bf91fa1e00f8a28c5446a2073dddaa5447d"} Sep 30 14:18:15 
crc kubenswrapper[4676]: I0930 14:18:15.375257 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2m994" event={"ID":"b0f731ab-125f-48d4-838e-a38a5e78c6fb","Type":"ContainerStarted","Data":"f88a084860b35a0926c41bf279dcba241b63c4b3ad1a0f192816a697b1df10bd"} Sep 30 14:18:15 crc kubenswrapper[4676]: E0930 14:18:15.377019 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-fptlq" podUID="7724b184-865f-4ced-bdf7-867184cf3647" Sep 30 14:18:15 crc kubenswrapper[4676]: I0930 14:18:15.412807 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2m994" podStartSLOduration=1.785370264 podStartE2EDuration="20.412785756s" podCreationTimestamp="2025-09-30 14:17:55 +0000 UTC" firstStartedPulling="2025-09-30 14:17:56.040559085 +0000 UTC m=+1180.023647514" lastFinishedPulling="2025-09-30 14:18:14.667974577 +0000 UTC m=+1198.651063006" observedRunningTime="2025-09-30 14:18:15.411173842 +0000 UTC m=+1199.394262261" watchObservedRunningTime="2025-09-30 14:18:15.412785756 +0000 UTC m=+1199.395874185" Sep 30 14:18:20 crc kubenswrapper[4676]: I0930 14:18:20.423180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"a998e5851954813f8e1eec13388bc6d866d76a7f97599dc82ad44fe9ea41338f"} Sep 30 14:18:21 crc kubenswrapper[4676]: I0930 14:18:21.445498 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"d1a59da75c58c0179cd72ed939e2605d630ac92c3c16eb1868e7c22465d6d56f"} Sep 30 14:18:21 crc kubenswrapper[4676]: I0930 14:18:21.446144 4676 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"6c9c05c30356e0a70a409bdf7d72fdd07cf89c5371ceed5a38a95abdc219f9cb"} Sep 30 14:18:22 crc kubenswrapper[4676]: I0930 14:18:22.447334 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"e8533fd4da4791fa9ceccf248ae6e96ca6e1eb1a73ab3b7e0ba40a8fc3b4139f"} Sep 30 14:18:24 crc kubenswrapper[4676]: I0930 14:18:24.469681 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"ec7fdfe08c792015ae9df2632c07bb1a2d05fbe4a8a173b36800d0d24219fa24"} Sep 30 14:18:25 crc kubenswrapper[4676]: I0930 14:18:25.485586 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"5e7d23d33c03a1f64ec5514eefa27a1bc98de6b495d11cb1953761ab1feb1aad"} Sep 30 14:18:25 crc kubenswrapper[4676]: I0930 14:18:25.486184 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"c6503593ea24121188f9379e68f1ba80f9d598101d77a31e774d19f932a0de65"} Sep 30 14:18:25 crc kubenswrapper[4676]: I0930 14:18:25.486197 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"21a5a1e432d5ef5e90794a8a22b9e5d3a961f46dcb100bfed17d19dfd08c469d"} Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.499520 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"f6bc1a5261768be667a5438ee89da7601485d64a67f12d793de6d20351118c03"} Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.499820 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"028a3feced2986caf956b6a2ab9d5b2a2729f455e2035b76a34d3bba15a9e121"} Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.499842 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c0230d2-8bbc-4ad0-8f3d-062d2d940013","Type":"ContainerStarted","Data":"1d54dfe000c41e79d8935604d0e5066042fbe04c70bd399ea2ea16a32a30f59b"} Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.538389 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.059656154 podStartE2EDuration="1m5.53836949s" podCreationTimestamp="2025-09-30 14:17:21 +0000 UTC" firstStartedPulling="2025-09-30 14:17:55.672336832 +0000 UTC m=+1179.655425261" lastFinishedPulling="2025-09-30 14:18:24.151050168 +0000 UTC m=+1208.134138597" observedRunningTime="2025-09-30 14:18:26.533707544 +0000 UTC m=+1210.516795983" watchObservedRunningTime="2025-09-30 14:18:26.53836949 +0000 UTC m=+1210.521457919" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.848509 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-q8d7x"] Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.850325 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.852792 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.889037 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-q8d7x"] Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.908909 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-config\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.908976 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.909000 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.909065 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " 
pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.909088 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpq7k\" (UniqueName: \"kubernetes.io/projected/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-kube-api-access-jpq7k\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:26 crc kubenswrapper[4676]: I0930 14:18:26.909119 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.010397 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.010458 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpq7k\" (UniqueName: \"kubernetes.io/projected/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-kube-api-access-jpq7k\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.010496 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: 
\"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.010597 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-config\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.010632 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.010650 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.011547 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.011601 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 
crc kubenswrapper[4676]: I0930 14:18:27.011685 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.011717 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.012303 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-config\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.033134 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpq7k\" (UniqueName: \"kubernetes.io/projected/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-kube-api-access-jpq7k\") pod \"dnsmasq-dns-77585f5f8c-q8d7x\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.166974 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:27 crc kubenswrapper[4676]: I0930 14:18:27.599812 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-q8d7x"] Sep 30 14:18:28 crc kubenswrapper[4676]: I0930 14:18:28.520772 4676 generic.go:334] "Generic (PLEG): container finished" podID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerID="8d2808d46e31d0ad70aeded8f32540515c94b69af77502ff1e7df89936508090" exitCode=0 Sep 30 14:18:28 crc kubenswrapper[4676]: I0930 14:18:28.520844 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" event={"ID":"701a9ff6-6a4f-4caf-8ea1-464d13196d6c","Type":"ContainerDied","Data":"8d2808d46e31d0ad70aeded8f32540515c94b69af77502ff1e7df89936508090"} Sep 30 14:18:28 crc kubenswrapper[4676]: I0930 14:18:28.521083 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" event={"ID":"701a9ff6-6a4f-4caf-8ea1-464d13196d6c","Type":"ContainerStarted","Data":"2ab4497c14af1c3911b83b251c6cb8e9f45357137cf12115580aacb2fbe329bb"} Sep 30 14:18:29 crc kubenswrapper[4676]: I0930 14:18:29.531423 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" event={"ID":"701a9ff6-6a4f-4caf-8ea1-464d13196d6c","Type":"ContainerStarted","Data":"b946a9ca330f94d7a0033eb5c9cd9cf254b86d53180921a82bb1d3cef6860bdc"} Sep 30 14:18:29 crc kubenswrapper[4676]: I0930 14:18:29.531863 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:29 crc kubenswrapper[4676]: I0930 14:18:29.569386 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" podStartSLOduration=3.569364704 podStartE2EDuration="3.569364704s" podCreationTimestamp="2025-09-30 14:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:18:29.551435802 +0000 UTC m=+1213.534524251" watchObservedRunningTime="2025-09-30 14:18:29.569364704 +0000 UTC m=+1213.552453143" Sep 30 14:18:37 crc kubenswrapper[4676]: I0930 14:18:37.168678 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:18:37 crc kubenswrapper[4676]: I0930 14:18:37.235669 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zmzf2"] Sep 30 14:18:37 crc kubenswrapper[4676]: I0930 14:18:37.236060 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-zmzf2" podUID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerName="dnsmasq-dns" containerID="cri-o://95c822ac8de5d5cc4cbc980762011dabb92ba7666e85647803c484eebce9cac6" gracePeriod=10 Sep 30 14:18:38 crc kubenswrapper[4676]: I0930 14:18:38.609683 4676 generic.go:334] "Generic (PLEG): container finished" podID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerID="95c822ac8de5d5cc4cbc980762011dabb92ba7666e85647803c484eebce9cac6" exitCode=0 Sep 30 14:18:38 crc kubenswrapper[4676]: I0930 14:18:38.609772 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zmzf2" event={"ID":"70a50116-382a-4dae-8f0a-d47de54cffcf","Type":"ContainerDied","Data":"95c822ac8de5d5cc4cbc980762011dabb92ba7666e85647803c484eebce9cac6"} Sep 30 14:18:41 crc kubenswrapper[4676]: I0930 14:18:41.937327 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:18:41 crc kubenswrapper[4676]: I0930 14:18:41.962631 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-dns-svc\") pod \"70a50116-382a-4dae-8f0a-d47de54cffcf\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " Sep 30 14:18:41 crc kubenswrapper[4676]: I0930 14:18:41.962777 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-config\") pod \"70a50116-382a-4dae-8f0a-d47de54cffcf\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " Sep 30 14:18:41 crc kubenswrapper[4676]: I0930 14:18:41.962816 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtjbp\" (UniqueName: \"kubernetes.io/projected/70a50116-382a-4dae-8f0a-d47de54cffcf-kube-api-access-mtjbp\") pod \"70a50116-382a-4dae-8f0a-d47de54cffcf\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " Sep 30 14:18:41 crc kubenswrapper[4676]: I0930 14:18:41.963028 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-sb\") pod \"70a50116-382a-4dae-8f0a-d47de54cffcf\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " Sep 30 14:18:41 crc kubenswrapper[4676]: I0930 14:18:41.963080 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-nb\") pod \"70a50116-382a-4dae-8f0a-d47de54cffcf\" (UID: \"70a50116-382a-4dae-8f0a-d47de54cffcf\") " Sep 30 14:18:41 crc kubenswrapper[4676]: I0930 14:18:41.986558 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/70a50116-382a-4dae-8f0a-d47de54cffcf-kube-api-access-mtjbp" (OuterVolumeSpecName: "kube-api-access-mtjbp") pod "70a50116-382a-4dae-8f0a-d47de54cffcf" (UID: "70a50116-382a-4dae-8f0a-d47de54cffcf"). InnerVolumeSpecName "kube-api-access-mtjbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.011249 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70a50116-382a-4dae-8f0a-d47de54cffcf" (UID: "70a50116-382a-4dae-8f0a-d47de54cffcf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.022236 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "70a50116-382a-4dae-8f0a-d47de54cffcf" (UID: "70a50116-382a-4dae-8f0a-d47de54cffcf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.023100 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-config" (OuterVolumeSpecName: "config") pod "70a50116-382a-4dae-8f0a-d47de54cffcf" (UID: "70a50116-382a-4dae-8f0a-d47de54cffcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.037503 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70a50116-382a-4dae-8f0a-d47de54cffcf" (UID: "70a50116-382a-4dae-8f0a-d47de54cffcf"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.065395 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.065425 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.065435 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.065443 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70a50116-382a-4dae-8f0a-d47de54cffcf-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.065453 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtjbp\" (UniqueName: \"kubernetes.io/projected/70a50116-382a-4dae-8f0a-d47de54cffcf-kube-api-access-mtjbp\") on node \"crc\" DevicePath \"\"" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.646073 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-zmzf2" event={"ID":"70a50116-382a-4dae-8f0a-d47de54cffcf","Type":"ContainerDied","Data":"a031752b598ddea02d7de95edc0ffffa34d8b6d442fb955b5de8d9c13d41ea05"} Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.646378 4676 scope.go:117] "RemoveContainer" containerID="95c822ac8de5d5cc4cbc980762011dabb92ba7666e85647803c484eebce9cac6" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.646490 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-zmzf2" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.676256 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zmzf2"] Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.682812 4676 scope.go:117] "RemoveContainer" containerID="4087fb74a9a71f9b96ff7c1db8c5c123be4ab7d7437366d6e270a91611bd8d05" Sep 30 14:18:42 crc kubenswrapper[4676]: I0930 14:18:42.685651 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-zmzf2"] Sep 30 14:18:43 crc kubenswrapper[4676]: I0930 14:18:43.463653 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a50116-382a-4dae-8f0a-d47de54cffcf" path="/var/lib/kubelet/pods/70a50116-382a-4dae-8f0a-d47de54cffcf/volumes" Sep 30 14:18:44 crc kubenswrapper[4676]: I0930 14:18:44.663130 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptlq" event={"ID":"7724b184-865f-4ced-bdf7-867184cf3647","Type":"ContainerStarted","Data":"086ac2a3788928169cf290661c3c50dd7ad4957cff71c7bde933f95169f8b4ef"} Sep 30 14:18:45 crc kubenswrapper[4676]: I0930 14:18:45.690621 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-fptlq" podStartSLOduration=5.336525618 podStartE2EDuration="50.690599218s" podCreationTimestamp="2025-09-30 14:17:55 +0000 UTC" firstStartedPulling="2025-09-30 14:17:56.447129616 +0000 UTC m=+1180.430218035" lastFinishedPulling="2025-09-30 14:18:41.801203206 +0000 UTC m=+1225.784291635" observedRunningTime="2025-09-30 14:18:45.688726358 +0000 UTC m=+1229.671814817" watchObservedRunningTime="2025-09-30 14:18:45.690599218 +0000 UTC m=+1229.673687647" Sep 30 14:19:19 crc kubenswrapper[4676]: I0930 14:19:19.972097 4676 generic.go:334] "Generic (PLEG): container finished" podID="b0f731ab-125f-48d4-838e-a38a5e78c6fb" containerID="f88a084860b35a0926c41bf279dcba241b63c4b3ad1a0f192816a697b1df10bd" 
exitCode=0 Sep 30 14:19:19 crc kubenswrapper[4676]: I0930 14:19:19.972183 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2m994" event={"ID":"b0f731ab-125f-48d4-838e-a38a5e78c6fb","Type":"ContainerDied","Data":"f88a084860b35a0926c41bf279dcba241b63c4b3ad1a0f192816a697b1df10bd"} Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.372922 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2m994" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.520815 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-config-data\") pod \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.521001 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-combined-ca-bundle\") pod \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.521074 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmzjq\" (UniqueName: \"kubernetes.io/projected/b0f731ab-125f-48d4-838e-a38a5e78c6fb-kube-api-access-nmzjq\") pod \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\" (UID: \"b0f731ab-125f-48d4-838e-a38a5e78c6fb\") " Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.538330 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f731ab-125f-48d4-838e-a38a5e78c6fb-kube-api-access-nmzjq" (OuterVolumeSpecName: "kube-api-access-nmzjq") pod "b0f731ab-125f-48d4-838e-a38a5e78c6fb" (UID: "b0f731ab-125f-48d4-838e-a38a5e78c6fb"). InnerVolumeSpecName "kube-api-access-nmzjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.552674 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0f731ab-125f-48d4-838e-a38a5e78c6fb" (UID: "b0f731ab-125f-48d4-838e-a38a5e78c6fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.573646 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-config-data" (OuterVolumeSpecName: "config-data") pod "b0f731ab-125f-48d4-838e-a38a5e78c6fb" (UID: "b0f731ab-125f-48d4-838e-a38a5e78c6fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.622596 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmzjq\" (UniqueName: \"kubernetes.io/projected/b0f731ab-125f-48d4-838e-a38a5e78c6fb-kube-api-access-nmzjq\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.622626 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.622636 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f731ab-125f-48d4-838e-a38a5e78c6fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.992578 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2m994" 
event={"ID":"b0f731ab-125f-48d4-838e-a38a5e78c6fb","Type":"ContainerDied","Data":"1b447e90a57fbe7d7f47ef104ed606e59e5cc1f8bfd7f93dbd8349ff00af37c8"} Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.992631 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b447e90a57fbe7d7f47ef104ed606e59e5cc1f8bfd7f93dbd8349ff00af37c8" Sep 30 14:19:21 crc kubenswrapper[4676]: I0930 14:19:21.992679 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2m994" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.266006 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-r6vhh"] Sep 30 14:19:22 crc kubenswrapper[4676]: E0930 14:19:22.266620 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerName="init" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.266681 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerName="init" Sep 30 14:19:22 crc kubenswrapper[4676]: E0930 14:19:22.266743 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f731ab-125f-48d4-838e-a38a5e78c6fb" containerName="keystone-db-sync" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.266870 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f731ab-125f-48d4-838e-a38a5e78c6fb" containerName="keystone-db-sync" Sep 30 14:19:22 crc kubenswrapper[4676]: E0930 14:19:22.266948 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerName="dnsmasq-dns" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.266996 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerName="dnsmasq-dns" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.267210 4676 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="70a50116-382a-4dae-8f0a-d47de54cffcf" containerName="dnsmasq-dns" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.267272 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f731ab-125f-48d4-838e-a38a5e78c6fb" containerName="keystone-db-sync" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.268369 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.293777 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-r6vhh"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.305059 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-722hg"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.306490 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.315272 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.315488 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.315612 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.315690 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7jtlk" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.340833 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " 
pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.340937 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.340975 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.340998 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvqxh\" (UniqueName: \"kubernetes.io/projected/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-kube-api-access-xvqxh\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.341020 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-svc\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.341070 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-config\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: 
\"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.342200 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-722hg"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442252 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-fernet-keys\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442312 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xfjl\" (UniqueName: \"kubernetes.io/projected/7f948430-7d30-45f5-95eb-559952c47cdd-kube-api-access-9xfjl\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442351 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-combined-ca-bundle\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442415 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442434 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-scripts\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442464 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-credential-keys\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442500 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442524 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvqxh\" (UniqueName: \"kubernetes.io/projected/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-kube-api-access-xvqxh\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442546 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-svc\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442565 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442594 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-config-data\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.442630 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-config\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.443715 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.445386 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d5c48cd87-fdhnt"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.447033 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.447209 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-config\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.447420 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.448962 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-svc\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.453390 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.460157 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.460413 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.465472 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mhjpr" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.465689 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.477251 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvqxh\" (UniqueName: \"kubernetes.io/projected/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-kube-api-access-xvqxh\") pod \"dnsmasq-dns-55fff446b9-r6vhh\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") " pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.508011 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5c48cd87-fdhnt"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.523248 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wxj4s"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.524820 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.531038 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.531286 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jt2br" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.532049 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.543454 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wxj4s"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.546857 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-scripts\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.546971 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-scripts\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547029 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-horizon-secret-key\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547073 4676 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-credential-keys\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547143 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-config-data\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547167 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-logs\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547217 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-config-data\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547242 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d75d9\" (UniqueName: \"kubernetes.io/projected/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-kube-api-access-d75d9\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547294 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-fernet-keys\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547324 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xfjl\" (UniqueName: \"kubernetes.io/projected/7f948430-7d30-45f5-95eb-559952c47cdd-kube-api-access-9xfjl\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.547387 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-combined-ca-bundle\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.576259 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-scripts\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.576753 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-combined-ca-bundle\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.576959 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-config-data\") pod \"keystone-bootstrap-722hg\" (UID: 
\"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.581490 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-credential-keys\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.585016 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-fernet-keys\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.592086 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.597117 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.599245 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.617003 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.618099 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xfjl\" (UniqueName: \"kubernetes.io/projected/7f948430-7d30-45f5-95eb-559952c47cdd-kube-api-access-9xfjl\") pod \"keystone-bootstrap-722hg\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.623412 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.623668 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.644363 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-722hg" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649214 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-config-data\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649266 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-db-sync-config-data\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649306 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-logs\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649337 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-config-data\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649380 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d75d9\" (UniqueName: \"kubernetes.io/projected/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-kube-api-access-d75d9\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc 
kubenswrapper[4676]: I0930 14:19:22.649428 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-combined-ca-bundle\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649467 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7czf\" (UniqueName: \"kubernetes.io/projected/63080796-b0be-4b3a-8db5-8242e2eb2bb3-kube-api-access-t7czf\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649493 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63080796-b0be-4b3a-8db5-8242e2eb2bb3-etc-machine-id\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649551 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-scripts\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649593 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-horizon-secret-key\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.649645 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-scripts\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.650382 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-logs\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.651316 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-scripts\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.651790 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-config-data\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.665579 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-horizon-secret-key\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.689475 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d75d9\" (UniqueName: 
\"kubernetes.io/projected/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-kube-api-access-d75d9\") pod \"horizon-5d5c48cd87-fdhnt\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.699437 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-b5nzl"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.700646 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.744054 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.774543 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-g2kbp" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787431 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-scripts\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787502 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-scripts\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-config-data\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc 
kubenswrapper[4676]: I0930 14:19:22.787576 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-db-sync-config-data\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787698 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt82t\" (UniqueName: \"kubernetes.io/projected/5540da5e-a02e-437f-82a8-e0f74ac91760-kube-api-access-nt82t\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787755 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-combined-ca-bundle\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787834 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7czf\" (UniqueName: \"kubernetes.io/projected/63080796-b0be-4b3a-8db5-8242e2eb2bb3-kube-api-access-t7czf\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787862 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787919 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63080796-b0be-4b3a-8db5-8242e2eb2bb3-etc-machine-id\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.787977 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-run-httpd\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.788015 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-config-data\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.788166 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.788222 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-log-httpd\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.788687 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.789994 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63080796-b0be-4b3a-8db5-8242e2eb2bb3-etc-machine-id\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.798811 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-combined-ca-bundle\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.825245 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-scripts\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.830898 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-db-sync-config-data\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.868789 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-config-data\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.886831 4676 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t7czf\" (UniqueName: \"kubernetes.io/projected/63080796-b0be-4b3a-8db5-8242e2eb2bb3-kube-api-access-t7czf\") pod \"cinder-db-sync-wxj4s\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.890643 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.890776 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-log-httpd\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.890823 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-scripts\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.890859 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-combined-ca-bundle\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.890967 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jktg8\" (UniqueName: 
\"kubernetes.io/projected/c1fd6e1e-38da-4634-9862-21c027ea770a-kube-api-access-jktg8\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.891029 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt82t\" (UniqueName: \"kubernetes.io/projected/5540da5e-a02e-437f-82a8-e0f74ac91760-kube-api-access-nt82t\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.891093 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.891131 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-run-httpd\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.891165 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-config-data\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.891204 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-db-sync-config-data\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " 
pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.894719 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-log-httpd\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.900418 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-run-httpd\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.907845 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-scripts\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.910424 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.916145 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-config-data\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.924268 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b5nzl"] Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.943004 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.973458 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt82t\" (UniqueName: \"kubernetes.io/projected/5540da5e-a02e-437f-82a8-e0f74ac91760-kube-api-access-nt82t\") pod \"ceilometer-0\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " pod="openstack/ceilometer-0" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.999004 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-combined-ca-bundle\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.999057 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jktg8\" (UniqueName: \"kubernetes.io/projected/c1fd6e1e-38da-4634-9862-21c027ea770a-kube-api-access-jktg8\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:22 crc kubenswrapper[4676]: I0930 14:19:22.999130 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-db-sync-config-data\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.004410 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-db-sync-config-data\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.007206 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-combined-ca-bundle\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.032933 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-2mftx"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.034323 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.040931 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jvbzz" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.041135 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.041252 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.053898 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jktg8\" (UniqueName: \"kubernetes.io/projected/c1fd6e1e-38da-4634-9862-21c027ea770a-kube-api-access-jktg8\") pod \"barbican-db-sync-b5nzl\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.055161 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-45gkb"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 
14:19:23.056701 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.061467 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ml554" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.061836 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.061848 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.091333 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-r6vhh"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.108090 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2mftx"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.128986 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-45gkb"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.129067 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-866c445777-2hmsg"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.130609 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-866c445777-2hmsg"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.130703 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.155272 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-k2b4r"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.157317 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.166148 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.168053 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-k2b4r"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.204977 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.206401 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04eaad6-de72-4265-aa3d-fda03a0ea925-logs\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.206682 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-config-data\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.207245 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmt5\" (UniqueName: \"kubernetes.io/projected/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-kube-api-access-bsmt5\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.207333 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fngq9\" (UniqueName: 
\"kubernetes.io/projected/b04eaad6-de72-4265-aa3d-fda03a0ea925-kube-api-access-fngq9\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.207353 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-combined-ca-bundle\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.207471 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-combined-ca-bundle\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.207545 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-scripts\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.207587 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-config\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.261280 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309415 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3769ebb3-5ddc-4940-a088-79308a08ef6c-horizon-secret-key\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309457 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fngq9\" (UniqueName: \"kubernetes.io/projected/b04eaad6-de72-4265-aa3d-fda03a0ea925-kube-api-access-fngq9\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309482 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-combined-ca-bundle\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309528 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769ebb3-5ddc-4940-a088-79308a08ef6c-logs\") pod \"horizon-866c445777-2hmsg\" (UID: 
\"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309583 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-scripts\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309638 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-combined-ca-bundle\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.309692 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310092 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310138 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-config-data\") pod \"horizon-866c445777-2hmsg\" (UID: 
\"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310170 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-scripts\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310218 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-config\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310268 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04eaad6-de72-4265-aa3d-fda03a0ea925-logs\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310322 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsz6g\" (UniqueName: \"kubernetes.io/projected/3769ebb3-5ddc-4940-a088-79308a08ef6c-kube-api-access-bsz6g\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310351 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-config-data\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 
14:19:23.310374 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-config\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310424 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmt5\" (UniqueName: \"kubernetes.io/projected/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-kube-api-access-bsmt5\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310452 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcgkm\" (UniqueName: \"kubernetes.io/projected/cef0985d-7a66-49f5-a4ac-fabf92844c3e-kube-api-access-zcgkm\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.310481 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.314291 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04eaad6-de72-4265-aa3d-fda03a0ea925-logs\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.321916 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-combined-ca-bundle\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.324530 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-combined-ca-bundle\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.324617 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-scripts\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.327044 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-config-data\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.327570 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-config\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.338496 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fngq9\" (UniqueName: 
\"kubernetes.io/projected/b04eaad6-de72-4265-aa3d-fda03a0ea925-kube-api-access-fngq9\") pod \"placement-db-sync-45gkb\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.345054 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmt5\" (UniqueName: \"kubernetes.io/projected/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-kube-api-access-bsmt5\") pod \"neutron-db-sync-2mftx\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.399222 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2mftx" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.411704 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.411751 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.411786 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-config-data\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.411857 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsz6g\" (UniqueName: \"kubernetes.io/projected/3769ebb3-5ddc-4940-a088-79308a08ef6c-kube-api-access-bsz6g\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.411906 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-config\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.411944 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcgkm\" (UniqueName: \"kubernetes.io/projected/cef0985d-7a66-49f5-a4ac-fabf92844c3e-kube-api-access-zcgkm\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.411978 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.412012 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.412041 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3769ebb3-5ddc-4940-a088-79308a08ef6c-horizon-secret-key\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.412087 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769ebb3-5ddc-4940-a088-79308a08ef6c-logs\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.412125 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-scripts\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.413084 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-scripts\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.413977 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.414926 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-svc\") pod 
\"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.414972 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-config\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.415209 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.415770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.418767 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769ebb3-5ddc-4940-a088-79308a08ef6c-logs\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.418905 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-config-data\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 
14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.438438 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsz6g\" (UniqueName: \"kubernetes.io/projected/3769ebb3-5ddc-4940-a088-79308a08ef6c-kube-api-access-bsz6g\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.440385 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3769ebb3-5ddc-4940-a088-79308a08ef6c-horizon-secret-key\") pod \"horizon-866c445777-2hmsg\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.441034 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-45gkb" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.443037 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcgkm\" (UniqueName: \"kubernetes.io/projected/cef0985d-7a66-49f5-a4ac-fabf92844c3e-kube-api-access-zcgkm\") pod \"dnsmasq-dns-76fcf4b695-k2b4r\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.535445 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.549308 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.654274 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-r6vhh"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.839012 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d5c48cd87-fdhnt"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.846341 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-722hg"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.862441 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wxj4s"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.970313 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-b5nzl"] Sep 30 14:19:23 crc kubenswrapper[4676]: I0930 14:19:23.980714 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:19:23 crc kubenswrapper[4676]: W0930 14:19:23.994182 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1fd6e1e_38da_4634_9862_21c027ea770a.slice/crio-bdfd0f30116ee60462a628ab7dedbf55fad6923b35dbf6e0076e8a5e1cf426e6 WatchSource:0}: Error finding container bdfd0f30116ee60462a628ab7dedbf55fad6923b35dbf6e0076e8a5e1cf426e6: Status 404 returned error can't find the container with id bdfd0f30116ee60462a628ab7dedbf55fad6923b35dbf6e0076e8a5e1cf426e6 Sep 30 14:19:24 crc kubenswrapper[4676]: W0930 14:19:24.032249 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5540da5e_a02e_437f_82a8_e0f74ac91760.slice/crio-8e7ea66401a3fe8da0c30c75221e2fe6c4c0e4c20f8ec8a4a4a7dabc57daea2f WatchSource:0}: Error finding container 8e7ea66401a3fe8da0c30c75221e2fe6c4c0e4c20f8ec8a4a4a7dabc57daea2f: Status 404 returned error 
can't find the container with id 8e7ea66401a3fe8da0c30c75221e2fe6c4c0e4c20f8ec8a4a4a7dabc57daea2f Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.070690 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" event={"ID":"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1","Type":"ContainerStarted","Data":"2b59b614b914471383a285c057d436cd13efe02cb809b9c004dc87dfd8a1631b"} Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.073349 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5540da5e-a02e-437f-82a8-e0f74ac91760","Type":"ContainerStarted","Data":"8e7ea66401a3fe8da0c30c75221e2fe6c4c0e4c20f8ec8a4a4a7dabc57daea2f"} Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.077153 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-722hg" event={"ID":"7f948430-7d30-45f5-95eb-559952c47cdd","Type":"ContainerStarted","Data":"eb123e5400cef122aa02d4fdd1be3d390dde73a3a87fbbf13e3b419cf26bbcbc"} Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.079305 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wxj4s" event={"ID":"63080796-b0be-4b3a-8db5-8242e2eb2bb3","Type":"ContainerStarted","Data":"c5e57c8741c9ce91017786c2a42a15da636d0308f2596ec415e201cb107129cf"} Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.082600 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b5nzl" event={"ID":"c1fd6e1e-38da-4634-9862-21c027ea770a","Type":"ContainerStarted","Data":"bdfd0f30116ee60462a628ab7dedbf55fad6923b35dbf6e0076e8a5e1cf426e6"} Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.086590 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5c48cd87-fdhnt" event={"ID":"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae","Type":"ContainerStarted","Data":"b0902124a4e885abdf52e2adec43f7c61ca9e6d7950f829ba01f978c11abd4e6"} Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 
14:19:24.346049 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2mftx"] Sep 30 14:19:24 crc kubenswrapper[4676]: W0930 14:19:24.365771 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9a3961e_e61e_4a02_9fe4_bc1b5ae097cc.slice/crio-67d39d4e26cf5e5521d66ba11efa23b3ade4a19783643d7ebefff7340d0ac4ed WatchSource:0}: Error finding container 67d39d4e26cf5e5521d66ba11efa23b3ade4a19783643d7ebefff7340d0ac4ed: Status 404 returned error can't find the container with id 67d39d4e26cf5e5521d66ba11efa23b3ade4a19783643d7ebefff7340d0ac4ed Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.478457 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-k2b4r"] Sep 30 14:19:24 crc kubenswrapper[4676]: W0930 14:19:24.521545 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcef0985d_7a66_49f5_a4ac_fabf92844c3e.slice/crio-204a08448dcc4ef2f75612186a75c6e67dcf048c5c40c62840f5281afd13fb21 WatchSource:0}: Error finding container 204a08448dcc4ef2f75612186a75c6e67dcf048c5c40c62840f5281afd13fb21: Status 404 returned error can't find the container with id 204a08448dcc4ef2f75612186a75c6e67dcf048c5c40c62840f5281afd13fb21 Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.555499 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-45gkb"] Sep 30 14:19:24 crc kubenswrapper[4676]: W0930 14:19:24.566813 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb04eaad6_de72_4265_aa3d_fda03a0ea925.slice/crio-384c2dfa2233bfc3f317ddfb8882b62680b0db352c8bb0276707a18523693847 WatchSource:0}: Error finding container 384c2dfa2233bfc3f317ddfb8882b62680b0db352c8bb0276707a18523693847: Status 404 returned error can't find the container with id 
384c2dfa2233bfc3f317ddfb8882b62680b0db352c8bb0276707a18523693847 Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.571386 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-866c445777-2hmsg"] Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.829838 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d5c48cd87-fdhnt"] Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.893096 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.948963 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67b767d9c7-wpv5b"] Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.950658 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b767d9c7-wpv5b" Sep 30 14:19:24 crc kubenswrapper[4676]: I0930 14:19:24.979369 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b767d9c7-wpv5b"] Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.057002 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a53aec0-8d03-4a4a-8ae3-736fc762491e-horizon-secret-key\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b" Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.057426 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-scripts\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b" Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.057462 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-config-data\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.057579 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a53aec0-8d03-4a4a-8ae3-736fc762491e-logs\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.057605 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj6wh\" (UniqueName: \"kubernetes.io/projected/3a53aec0-8d03-4a4a-8ae3-736fc762491e-kube-api-access-zj6wh\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.104170 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" event={"ID":"cef0985d-7a66-49f5-a4ac-fabf92844c3e","Type":"ContainerStarted","Data":"87d9756869ed2a1da3dd6f1fbcc5f711c00b50d58c463696069450cfbf198086"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.104216 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" event={"ID":"cef0985d-7a66-49f5-a4ac-fabf92844c3e","Type":"ContainerStarted","Data":"204a08448dcc4ef2f75612186a75c6e67dcf048c5c40c62840f5281afd13fb21"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.156784 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-45gkb" event={"ID":"b04eaad6-de72-4265-aa3d-fda03a0ea925","Type":"ContainerStarted","Data":"384c2dfa2233bfc3f317ddfb8882b62680b0db352c8bb0276707a18523693847"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.159810 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a53aec0-8d03-4a4a-8ae3-736fc762491e-logs\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.159892 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj6wh\" (UniqueName: \"kubernetes.io/projected/3a53aec0-8d03-4a4a-8ae3-736fc762491e-kube-api-access-zj6wh\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.159956 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a53aec0-8d03-4a4a-8ae3-736fc762491e-horizon-secret-key\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.159989 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-scripts\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.160028 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-config-data\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.162568 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-config-data\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.164265 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-scripts\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.171372 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a53aec0-8d03-4a4a-8ae3-736fc762491e-logs\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.176605 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c445777-2hmsg" event={"ID":"3769ebb3-5ddc-4940-a088-79308a08ef6c","Type":"ContainerStarted","Data":"4d954a525417b7b68d1053da78695fa7ba069d87858ae7a6cbd808d1e9781cfb"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.181585 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a53aec0-8d03-4a4a-8ae3-736fc762491e-horizon-secret-key\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.220820 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj6wh\" (UniqueName: \"kubernetes.io/projected/3a53aec0-8d03-4a4a-8ae3-736fc762491e-kube-api-access-zj6wh\") pod \"horizon-67b767d9c7-wpv5b\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.223773 4676 generic.go:334] "Generic (PLEG): container finished" podID="a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" containerID="d87049e44855d743148a0434eea94ba8c3b31fcdac1921b2183f37b460b3441f" exitCode=0
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.223903 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" event={"ID":"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1","Type":"ContainerDied","Data":"d87049e44855d743148a0434eea94ba8c3b31fcdac1921b2183f37b460b3441f"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.246212 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-722hg" event={"ID":"7f948430-7d30-45f5-95eb-559952c47cdd","Type":"ContainerStarted","Data":"15d4fa73b81c64d64a663b9b1b1d62f874e8abfa7d4eb14bfeea527f22babbf6"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.264982 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2mftx" event={"ID":"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc","Type":"ContainerStarted","Data":"a9ee4e61b97d16119699b853291cd49fbeb6cc8203cf25984ecfb1d6e96db9cf"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.265029 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2mftx" event={"ID":"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc","Type":"ContainerStarted","Data":"67d39d4e26cf5e5521d66ba11efa23b3ade4a19783643d7ebefff7340d0ac4ed"}
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.305611 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b767d9c7-wpv5b"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.306681 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-2mftx" podStartSLOduration=3.306659404 podStartE2EDuration="3.306659404s" podCreationTimestamp="2025-09-30 14:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:19:25.303936902 +0000 UTC m=+1269.287025331" watchObservedRunningTime="2025-09-30 14:19:25.306659404 +0000 UTC m=+1269.289747833"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.352456 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-722hg" podStartSLOduration=3.352430634 podStartE2EDuration="3.352430634s" podCreationTimestamp="2025-09-30 14:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:19:25.341436664 +0000 UTC m=+1269.324525103" watchObservedRunningTime="2025-09-30 14:19:25.352430634 +0000 UTC m=+1269.335519053"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.748574 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b767d9c7-wpv5b"]
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.873734 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-r6vhh"
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.991757 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-svc\") pod \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") "
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.991821 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-config\") pod \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") "
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.991936 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-sb\") pod \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") "
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.991988 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-nb\") pod \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") "
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.992013 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvqxh\" (UniqueName: \"kubernetes.io/projected/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-kube-api-access-xvqxh\") pod \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") "
Sep 30 14:19:25 crc kubenswrapper[4676]: I0930 14:19:25.992055 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-swift-storage-0\") pod \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\" (UID: \"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1\") "
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.033015 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-kube-api-access-xvqxh" (OuterVolumeSpecName: "kube-api-access-xvqxh") pod "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" (UID: "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1"). InnerVolumeSpecName "kube-api-access-xvqxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.056722 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" (UID: "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.056725 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" (UID: "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.075591 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" (UID: "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.076013 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" (UID: "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.090410 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-config" (OuterVolumeSpecName: "config") pod "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" (UID: "a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.099817 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.100321 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvqxh\" (UniqueName: \"kubernetes.io/projected/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-kube-api-access-xvqxh\") on node \"crc\" DevicePath \"\""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.100916 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.100935 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.100951 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-config\") on node \"crc\" DevicePath \"\""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.100965 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.280758 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b767d9c7-wpv5b" event={"ID":"3a53aec0-8d03-4a4a-8ae3-736fc762491e","Type":"ContainerStarted","Data":"d7217f39b8ed39c7e306baa3acbdcfdd3787cab66d62512545f944be534efcd6"}
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.283283 4676 generic.go:334] "Generic (PLEG): container finished" podID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerID="87d9756869ed2a1da3dd6f1fbcc5f711c00b50d58c463696069450cfbf198086" exitCode=0
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.283380 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" event={"ID":"cef0985d-7a66-49f5-a4ac-fabf92844c3e","Type":"ContainerDied","Data":"87d9756869ed2a1da3dd6f1fbcc5f711c00b50d58c463696069450cfbf198086"}
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.286553 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-r6vhh"
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.287068 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-r6vhh" event={"ID":"a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1","Type":"ContainerDied","Data":"2b59b614b914471383a285c057d436cd13efe02cb809b9c004dc87dfd8a1631b"}
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.287165 4676 scope.go:117] "RemoveContainer" containerID="d87049e44855d743148a0434eea94ba8c3b31fcdac1921b2183f37b460b3441f"
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.411374 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-r6vhh"]
Sep 30 14:19:26 crc kubenswrapper[4676]: I0930 14:19:26.416302 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-r6vhh"]
Sep 30 14:19:27 crc kubenswrapper[4676]: I0930 14:19:27.456054 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" path="/var/lib/kubelet/pods/a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1/volumes"
Sep 30 14:19:28 crc kubenswrapper[4676]: I0930 14:19:28.330213 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" event={"ID":"cef0985d-7a66-49f5-a4ac-fabf92844c3e","Type":"ContainerStarted","Data":"185eae2b3dcc4a3ac93914b66f37a3ecbfc99964c50d6eb283b10d3dbdff7545"}
Sep 30 14:19:28 crc kubenswrapper[4676]: I0930 14:19:28.331748 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r"
Sep 30 14:19:28 crc kubenswrapper[4676]: I0930 14:19:28.360552 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" podStartSLOduration=6.360531147 podStartE2EDuration="6.360531147s" podCreationTimestamp="2025-09-30 14:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:19:28.359072488 +0000 UTC m=+1272.342160907" watchObservedRunningTime="2025-09-30 14:19:28.360531147 +0000 UTC m=+1272.343619576"
Sep 30 14:19:29 crc kubenswrapper[4676]: I0930 14:19:29.919923 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:19:29 crc kubenswrapper[4676]: I0930 14:19:29.920505 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.453863 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-866c445777-2hmsg"]
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.473476 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6668cdff8d-z8vnk"]
Sep 30 14:19:31 crc kubenswrapper[4676]: E0930 14:19:31.474457 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" containerName="init"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.474552 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" containerName="init"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.474822 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c8fa15-0dff-4a7b-b8cb-0a5d92ce98c1" containerName="init"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.478487 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.489675 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.491800 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6668cdff8d-z8vnk"]
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.541851 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b767d9c7-wpv5b"]
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.560372 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68fc47cdb4-6758j"]
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.562142 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.577655 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68fc47cdb4-6758j"]
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.631988 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-tls-certs\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.632094 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-secret-key\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.632130 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-scripts\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.632167 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace12602-f0f7-4f29-8c37-72c1e840bacc-logs\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.632212 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-combined-ca-bundle\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.632274 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-config-data\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.632295 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrhhz\" (UniqueName: \"kubernetes.io/projected/ace12602-f0f7-4f29-8c37-72c1e840bacc-kube-api-access-rrhhz\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.734403 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-scripts\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.734657 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-horizon-tls-certs\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.734727 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-secret-key\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.734789 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-scripts\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.734869 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace12602-f0f7-4f29-8c37-72c1e840bacc-logs\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735026 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-combined-ca-bundle\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735096 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-config-data\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735183 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-logs\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735221 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-config-data\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735248 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrhhz\" (UniqueName: \"kubernetes.io/projected/ace12602-f0f7-4f29-8c37-72c1e840bacc-kube-api-access-rrhhz\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtt9k\" (UniqueName: \"kubernetes.io/projected/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-kube-api-access-dtt9k\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735381 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-horizon-secret-key\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735477 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-tls-certs\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735513 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-combined-ca-bundle\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.735419 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace12602-f0f7-4f29-8c37-72c1e840bacc-logs\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.736995 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-config-data\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.737026 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-scripts\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.743636 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-tls-certs\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.743677 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-combined-ca-bundle\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.743988 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-secret-key\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.760490 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrhhz\" (UniqueName: \"kubernetes.io/projected/ace12602-f0f7-4f29-8c37-72c1e840bacc-kube-api-access-rrhhz\") pod \"horizon-6668cdff8d-z8vnk\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.800758 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6668cdff8d-z8vnk"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.836761 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-horizon-tls-certs\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.836859 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-config-data\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.836909 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-logs\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.836946 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtt9k\" (UniqueName: \"kubernetes.io/projected/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-kube-api-access-dtt9k\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.836967 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-horizon-secret-key\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.837007 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-combined-ca-bundle\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.837035 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-scripts\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.837713 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-scripts\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.837972 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-logs\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.839158 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-config-data\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.841254 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-horizon-secret-key\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.841925 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-combined-ca-bundle\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.847570 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-horizon-tls-certs\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.854744 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtt9k\" (UniqueName: \"kubernetes.io/projected/a020c8ba-b848-4a3f-80e4-b3692cf99ffa-kube-api-access-dtt9k\") pod \"horizon-68fc47cdb4-6758j\" (UID: \"a020c8ba-b848-4a3f-80e4-b3692cf99ffa\") " pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:31 crc kubenswrapper[4676]: I0930 14:19:31.895366 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68fc47cdb4-6758j"
Sep 30 14:19:33 crc kubenswrapper[4676]: I0930 14:19:33.551249 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r"
Sep 30 14:19:33 crc kubenswrapper[4676]: I0930 14:19:33.626839 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-q8d7x"]
Sep 30 14:19:33 crc kubenswrapper[4676]: I0930 14:19:33.627490 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="dnsmasq-dns" containerID="cri-o://b946a9ca330f94d7a0033eb5c9cd9cf254b86d53180921a82bb1d3cef6860bdc" gracePeriod=10
Sep 30 14:19:34 crc kubenswrapper[4676]: I0930 14:19:34.400206 4676 generic.go:334] "Generic (PLEG): container finished" podID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerID="b946a9ca330f94d7a0033eb5c9cd9cf254b86d53180921a82bb1d3cef6860bdc" exitCode=0
Sep 30 14:19:34 crc kubenswrapper[4676]: I0930 14:19:34.400265 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" event={"ID":"701a9ff6-6a4f-4caf-8ea1-464d13196d6c","Type":"ContainerDied","Data":"b946a9ca330f94d7a0033eb5c9cd9cf254b86d53180921a82bb1d3cef6860bdc"}
Sep 30 14:19:42 crc kubenswrapper[4676]: I0930 14:19:42.167993 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout"
Sep 30 14:19:42 crc kubenswrapper[4676]: I0930 14:19:42.474447 4676 generic.go:334] "Generic (PLEG): container finished" podID="7f948430-7d30-45f5-95eb-559952c47cdd" containerID="15d4fa73b81c64d64a663b9b1b1d62f874e8abfa7d4eb14bfeea527f22babbf6" exitCode=0
Sep 30 14:19:42 crc kubenswrapper[4676]: I0930 14:19:42.474569 4676 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openstack/keystone-bootstrap-722hg" event={"ID":"7f948430-7d30-45f5-95eb-559952c47cdd","Type":"ContainerDied","Data":"15d4fa73b81c64d64a663b9b1b1d62f874e8abfa7d4eb14bfeea527f22babbf6"} Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.122119 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.228092 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-sb\") pod \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.228225 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-nb\") pod \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.228275 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-svc\") pod \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.228332 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-swift-storage-0\") pod \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.228437 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpq7k\" (UniqueName: 
\"kubernetes.io/projected/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-kube-api-access-jpq7k\") pod \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.228540 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-config\") pod \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\" (UID: \"701a9ff6-6a4f-4caf-8ea1-464d13196d6c\") " Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.238007 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-kube-api-access-jpq7k" (OuterVolumeSpecName: "kube-api-access-jpq7k") pod "701a9ff6-6a4f-4caf-8ea1-464d13196d6c" (UID: "701a9ff6-6a4f-4caf-8ea1-464d13196d6c"). InnerVolumeSpecName "kube-api-access-jpq7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.274565 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "701a9ff6-6a4f-4caf-8ea1-464d13196d6c" (UID: "701a9ff6-6a4f-4caf-8ea1-464d13196d6c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.274730 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "701a9ff6-6a4f-4caf-8ea1-464d13196d6c" (UID: "701a9ff6-6a4f-4caf-8ea1-464d13196d6c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.279586 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "701a9ff6-6a4f-4caf-8ea1-464d13196d6c" (UID: "701a9ff6-6a4f-4caf-8ea1-464d13196d6c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.282612 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "701a9ff6-6a4f-4caf-8ea1-464d13196d6c" (UID: "701a9ff6-6a4f-4caf-8ea1-464d13196d6c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.283592 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-config" (OuterVolumeSpecName: "config") pod "701a9ff6-6a4f-4caf-8ea1-464d13196d6c" (UID: "701a9ff6-6a4f-4caf-8ea1-464d13196d6c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.331588 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.331636 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.331646 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.331655 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.331669 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpq7k\" (UniqueName: \"kubernetes.io/projected/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-kube-api-access-jpq7k\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.331679 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701a9ff6-6a4f-4caf-8ea1-464d13196d6c-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.509141 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" event={"ID":"701a9ff6-6a4f-4caf-8ea1-464d13196d6c","Type":"ContainerDied","Data":"2ab4497c14af1c3911b83b251c6cb8e9f45357137cf12115580aacb2fbe329bb"} Sep 30 14:19:46 crc 
kubenswrapper[4676]: I0930 14:19:46.509200 4676 scope.go:117] "RemoveContainer" containerID="b946a9ca330f94d7a0033eb5c9cd9cf254b86d53180921a82bb1d3cef6860bdc" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.509226 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.552131 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-q8d7x"] Sep 30 14:19:46 crc kubenswrapper[4676]: I0930 14:19:46.558780 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-q8d7x"] Sep 30 14:19:46 crc kubenswrapper[4676]: E0930 14:19:46.610855 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701a9ff6_6a4f_4caf_8ea1_464d13196d6c.slice/crio-2ab4497c14af1c3911b83b251c6cb8e9f45357137cf12115580aacb2fbe329bb\": RecentStats: unable to find data in memory cache]" Sep 30 14:19:47 crc kubenswrapper[4676]: I0930 14:19:47.169307 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-q8d7x" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Sep 30 14:19:47 crc kubenswrapper[4676]: I0930 14:19:47.445500 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" path="/var/lib/kubelet/pods/701a9ff6-6a4f-4caf-8ea1-464d13196d6c/volumes" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.217689 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.218032 4676 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dh694h68ch5fchf4hcbh594h68bh657hcbh579h566h559h64fh556h665h5d4hbdhcfhb8h665h7fhcdh644h576h8hc8h5d8h5fdh66bh79h686q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bsz6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-866c445777-2hmsg_openstack(3769ebb3-5ddc-4940-a088-79308a08ef6c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.221684 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-866c445777-2hmsg" podUID="3769ebb3-5ddc-4940-a088-79308a08ef6c" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.402737 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.403456 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jktg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-b5nzl_openstack(c1fd6e1e-38da-4634-9862-21c027ea770a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.404844 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-b5nzl" 
podUID="c1fd6e1e-38da-4634-9862-21c027ea770a" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.453222 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.453593 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n554h668h98h575h5d6h587h54ch54dh679h688h665h58dh5b5h5fbh56bh98h647h55ch6dh55ch64ch574h56h664h549h588h56chc8h585h569h5fbh5dfq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d75d9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootF
ilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5d5c48cd87-fdhnt_openstack(70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.459562 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5d5c48cd87-fdhnt" podUID="70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae" Sep 30 14:19:53 crc kubenswrapper[4676]: E0930 14:19:53.571752 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-b5nzl" podUID="c1fd6e1e-38da-4634-9862-21c027ea770a" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.014926 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.015500 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fngq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Vol
umeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-45gkb_openstack(b04eaad6-de72-4265-aa3d-fda03a0ea925): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.016755 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-45gkb" podUID="b04eaad6-de72-4265-aa3d-fda03a0ea925" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.359259 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.359481 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66bh7fh687h549h5ddh645hd9h5f7h87hb7h5f6h568h6dh5bdh554h9ch59bh65ch84h59fhfh9dh66dh5b6h56ch5ffh88h64dhcfh549hddh549q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nt82t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5540da5e-a02e-437f-82a8-e0f74ac91760): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:19:55 crc kubenswrapper[4676]: I0930 14:19:55.364440 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.380416 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.380622 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd9hcchd9h5d9h545h658h596h96h59fhb9h559h5dh546h55fh658h59dh584h64h55dhbbh55dh8h544h9dhffh56hdhb7h679hcfh5bch55dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zj6wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-67b767d9c7-wpv5b_openstack(3a53aec0-8d03-4a4a-8ae3-736fc762491e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 
14:19:55.383165 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-67b767d9c7-wpv5b" podUID="3a53aec0-8d03-4a4a-8ae3-736fc762491e" Sep 30 14:19:55 crc kubenswrapper[4676]: E0930 14:19:55.591526 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-45gkb" podUID="b04eaad6-de72-4265-aa3d-fda03a0ea925" Sep 30 14:19:59 crc kubenswrapper[4676]: I0930 14:19:59.919587 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:19:59 crc kubenswrapper[4676]: I0930 14:19:59.919951 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.016206 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-722hg" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.023472 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.029975 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.073570 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b767d9c7-wpv5b" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.118784 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-config-data\") pod \"3769ebb3-5ddc-4940-a088-79308a08ef6c\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.118842 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-config-data\") pod \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.118909 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3769ebb3-5ddc-4940-a088-79308a08ef6c-horizon-secret-key\") pod \"3769ebb3-5ddc-4940-a088-79308a08ef6c\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.118963 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-horizon-secret-key\") pod \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.118990 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769ebb3-5ddc-4940-a088-79308a08ef6c-logs\") pod \"3769ebb3-5ddc-4940-a088-79308a08ef6c\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.119021 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-config-data\") pod \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.119045 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfjl\" (UniqueName: \"kubernetes.io/projected/7f948430-7d30-45f5-95eb-559952c47cdd-kube-api-access-9xfjl\") pod \"7f948430-7d30-45f5-95eb-559952c47cdd\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.119068 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-scripts\") pod \"7f948430-7d30-45f5-95eb-559952c47cdd\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.119099 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d75d9\" (UniqueName: \"kubernetes.io/projected/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-kube-api-access-d75d9\") pod \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.119405 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3769ebb3-5ddc-4940-a088-79308a08ef6c-logs" (OuterVolumeSpecName: "logs") pod "3769ebb3-5ddc-4940-a088-79308a08ef6c" (UID: "3769ebb3-5ddc-4940-a088-79308a08ef6c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120074 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a53aec0-8d03-4a4a-8ae3-736fc762491e-logs\") pod \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120161 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-config-data" (OuterVolumeSpecName: "config-data") pod "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae" (UID: "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120218 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-scripts\") pod \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120263 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-logs\") pod \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120296 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-combined-ca-bundle\") pod \"7f948430-7d30-45f5-95eb-559952c47cdd\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120326 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-credential-keys\") pod \"7f948430-7d30-45f5-95eb-559952c47cdd\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120385 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsz6g\" (UniqueName: \"kubernetes.io/projected/3769ebb3-5ddc-4940-a088-79308a08ef6c-kube-api-access-bsz6g\") pod \"3769ebb3-5ddc-4940-a088-79308a08ef6c\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120481 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-scripts\") pod \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\" (UID: \"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120510 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-config-data\") pod \"7f948430-7d30-45f5-95eb-559952c47cdd\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120538 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-fernet-keys\") pod \"7f948430-7d30-45f5-95eb-559952c47cdd\" (UID: \"7f948430-7d30-45f5-95eb-559952c47cdd\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120564 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj6wh\" (UniqueName: \"kubernetes.io/projected/3a53aec0-8d03-4a4a-8ae3-736fc762491e-kube-api-access-zj6wh\") pod \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " Sep 30 14:20:06 crc 
kubenswrapper[4676]: I0930 14:20:06.120610 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a53aec0-8d03-4a4a-8ae3-736fc762491e-horizon-secret-key\") pod \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\" (UID: \"3a53aec0-8d03-4a4a-8ae3-736fc762491e\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120677 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-scripts\") pod \"3769ebb3-5ddc-4940-a088-79308a08ef6c\" (UID: \"3769ebb3-5ddc-4940-a088-79308a08ef6c\") " Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.120812 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-logs" (OuterVolumeSpecName: "logs") pod "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae" (UID: "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.121355 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.121388 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3769ebb3-5ddc-4940-a088-79308a08ef6c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.121401 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.121818 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-scripts" (OuterVolumeSpecName: "scripts") pod "3769ebb3-5ddc-4940-a088-79308a08ef6c" (UID: "3769ebb3-5ddc-4940-a088-79308a08ef6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.125025 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f948430-7d30-45f5-95eb-559952c47cdd-kube-api-access-9xfjl" (OuterVolumeSpecName: "kube-api-access-9xfjl") pod "7f948430-7d30-45f5-95eb-559952c47cdd" (UID: "7f948430-7d30-45f5-95eb-559952c47cdd"). InnerVolumeSpecName "kube-api-access-9xfjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.125993 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3769ebb3-5ddc-4940-a088-79308a08ef6c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3769ebb3-5ddc-4940-a088-79308a08ef6c" (UID: "3769ebb3-5ddc-4940-a088-79308a08ef6c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.126018 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-kube-api-access-d75d9" (OuterVolumeSpecName: "kube-api-access-d75d9") pod "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae" (UID: "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae"). InnerVolumeSpecName "kube-api-access-d75d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.126272 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7f948430-7d30-45f5-95eb-559952c47cdd" (UID: "7f948430-7d30-45f5-95eb-559952c47cdd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.127014 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae" (UID: "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.127378 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-scripts" (OuterVolumeSpecName: "scripts") pod "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae" (UID: "70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.127683 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-scripts" (OuterVolumeSpecName: "scripts") pod "7f948430-7d30-45f5-95eb-559952c47cdd" (UID: "7f948430-7d30-45f5-95eb-559952c47cdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.128674 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3769ebb3-5ddc-4940-a088-79308a08ef6c-kube-api-access-bsz6g" (OuterVolumeSpecName: "kube-api-access-bsz6g") pod "3769ebb3-5ddc-4940-a088-79308a08ef6c" (UID: "3769ebb3-5ddc-4940-a088-79308a08ef6c"). InnerVolumeSpecName "kube-api-access-bsz6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.128944 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7f948430-7d30-45f5-95eb-559952c47cdd" (UID: "7f948430-7d30-45f5-95eb-559952c47cdd"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.129627 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a53aec0-8d03-4a4a-8ae3-736fc762491e-kube-api-access-zj6wh" (OuterVolumeSpecName: "kube-api-access-zj6wh") pod "3a53aec0-8d03-4a4a-8ae3-736fc762491e" (UID: "3a53aec0-8d03-4a4a-8ae3-736fc762491e"). InnerVolumeSpecName "kube-api-access-zj6wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.151037 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-config-data" (OuterVolumeSpecName: "config-data") pod "7f948430-7d30-45f5-95eb-559952c47cdd" (UID: "7f948430-7d30-45f5-95eb-559952c47cdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.152851 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f948430-7d30-45f5-95eb-559952c47cdd" (UID: "7f948430-7d30-45f5-95eb-559952c47cdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.169233 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a53aec0-8d03-4a4a-8ae3-736fc762491e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3a53aec0-8d03-4a4a-8ae3-736fc762491e" (UID: "3a53aec0-8d03-4a4a-8ae3-736fc762491e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.169418 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-config-data" (OuterVolumeSpecName: "config-data") pod "3a53aec0-8d03-4a4a-8ae3-736fc762491e" (UID: "3a53aec0-8d03-4a4a-8ae3-736fc762491e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.169680 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a53aec0-8d03-4a4a-8ae3-736fc762491e-logs" (OuterVolumeSpecName: "logs") pod "3a53aec0-8d03-4a4a-8ae3-736fc762491e" (UID: "3a53aec0-8d03-4a4a-8ae3-736fc762491e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.170226 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-config-data" (OuterVolumeSpecName: "config-data") pod "3769ebb3-5ddc-4940-a088-79308a08ef6c" (UID: "3769ebb3-5ddc-4940-a088-79308a08ef6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.175233 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-scripts" (OuterVolumeSpecName: "scripts") pod "3a53aec0-8d03-4a4a-8ae3-736fc762491e" (UID: "3a53aec0-8d03-4a4a-8ae3-736fc762491e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.223453 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.223720 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.223819 4676 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.223920 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsz6g\" (UniqueName: \"kubernetes.io/projected/3769ebb3-5ddc-4940-a088-79308a08ef6c-kube-api-access-bsz6g\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.223997 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224105 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224190 4676 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224255 4676 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-zj6wh\" (UniqueName: \"kubernetes.io/projected/3a53aec0-8d03-4a4a-8ae3-736fc762491e-kube-api-access-zj6wh\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224330 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a53aec0-8d03-4a4a-8ae3-736fc762491e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224400 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224480 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3769ebb3-5ddc-4940-a088-79308a08ef6c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224615 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3769ebb3-5ddc-4940-a088-79308a08ef6c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224836 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.224947 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a53aec0-8d03-4a4a-8ae3-736fc762491e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.225026 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfjl\" (UniqueName: 
\"kubernetes.io/projected/7f948430-7d30-45f5-95eb-559952c47cdd-kube-api-access-9xfjl\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.225098 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f948430-7d30-45f5-95eb-559952c47cdd-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.225184 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d75d9\" (UniqueName: \"kubernetes.io/projected/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae-kube-api-access-d75d9\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.225261 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a53aec0-8d03-4a4a-8ae3-736fc762491e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.699104 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d5c48cd87-fdhnt" event={"ID":"70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae","Type":"ContainerDied","Data":"b0902124a4e885abdf52e2adec43f7c61ca9e6d7950f829ba01f978c11abd4e6"} Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.699182 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d5c48cd87-fdhnt" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.702413 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b767d9c7-wpv5b" event={"ID":"3a53aec0-8d03-4a4a-8ae3-736fc762491e","Type":"ContainerDied","Data":"d7217f39b8ed39c7e306baa3acbdcfdd3787cab66d62512545f944be534efcd6"} Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.702426 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67b767d9c7-wpv5b" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.704951 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-866c445777-2hmsg" event={"ID":"3769ebb3-5ddc-4940-a088-79308a08ef6c","Type":"ContainerDied","Data":"4d954a525417b7b68d1053da78695fa7ba069d87858ae7a6cbd808d1e9781cfb"} Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.705039 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-866c445777-2hmsg" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.712619 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-722hg" event={"ID":"7f948430-7d30-45f5-95eb-559952c47cdd","Type":"ContainerDied","Data":"eb123e5400cef122aa02d4fdd1be3d390dde73a3a87fbbf13e3b419cf26bbcbc"} Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.712673 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb123e5400cef122aa02d4fdd1be3d390dde73a3a87fbbf13e3b419cf26bbcbc" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.712668 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-722hg" Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.783842 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d5c48cd87-fdhnt"] Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.800224 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d5c48cd87-fdhnt"] Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.827125 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b767d9c7-wpv5b"] Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.840333 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67b767d9c7-wpv5b"] Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.866613 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-866c445777-2hmsg"] Sep 30 14:20:06 crc kubenswrapper[4676]: I0930 14:20:06.876357 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-866c445777-2hmsg"] Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.133706 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-722hg"] Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.139570 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-722hg"] Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.214387 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-g276r"] Sep 30 14:20:07 crc kubenswrapper[4676]: E0930 14:20:07.215000 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="dnsmasq-dns" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.215016 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="dnsmasq-dns" Sep 30 14:20:07 crc kubenswrapper[4676]: E0930 14:20:07.215062 4676 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7f948430-7d30-45f5-95eb-559952c47cdd" containerName="keystone-bootstrap" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.215072 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f948430-7d30-45f5-95eb-559952c47cdd" containerName="keystone-bootstrap" Sep 30 14:20:07 crc kubenswrapper[4676]: E0930 14:20:07.215094 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="init" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.215103 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="init" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.215420 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f948430-7d30-45f5-95eb-559952c47cdd" containerName="keystone-bootstrap" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.215443 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="701a9ff6-6a4f-4caf-8ea1-464d13196d6c" containerName="dnsmasq-dns" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.216536 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.220611 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.220644 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.220737 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7jtlk" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.220660 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.223726 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g276r"] Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.246408 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmx44\" (UniqueName: \"kubernetes.io/projected/da58f130-eba3-4e56-97c1-6eba2641fa7d-kube-api-access-rmx44\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.246459 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-credential-keys\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.246510 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-scripts\") pod \"keystone-bootstrap-g276r\" (UID: 
\"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.246550 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-combined-ca-bundle\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.246578 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-config-data\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.246604 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-fernet-keys\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.348515 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-scripts\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.348587 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-combined-ca-bundle\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " 
pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.348628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-config-data\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.348663 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-fernet-keys\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.348742 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmx44\" (UniqueName: \"kubernetes.io/projected/da58f130-eba3-4e56-97c1-6eba2641fa7d-kube-api-access-rmx44\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.348775 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-credential-keys\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.354296 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-scripts\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.354796 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-combined-ca-bundle\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.354972 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-fernet-keys\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.358164 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-config-data\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.368064 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-credential-keys\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.368567 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmx44\" (UniqueName: \"kubernetes.io/projected/da58f130-eba3-4e56-97c1-6eba2641fa7d-kube-api-access-rmx44\") pod \"keystone-bootstrap-g276r\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.551354 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.592383 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3769ebb3-5ddc-4940-a088-79308a08ef6c" path="/var/lib/kubelet/pods/3769ebb3-5ddc-4940-a088-79308a08ef6c/volumes" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.629186 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a53aec0-8d03-4a4a-8ae3-736fc762491e" path="/var/lib/kubelet/pods/3a53aec0-8d03-4a4a-8ae3-736fc762491e/volumes" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.635542 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae" path="/var/lib/kubelet/pods/70e1b5b0-8a73-47b7-b1f6-2ce64a85d1ae/volumes" Sep 30 14:20:07 crc kubenswrapper[4676]: I0930 14:20:07.636211 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f948430-7d30-45f5-95eb-559952c47cdd" path="/var/lib/kubelet/pods/7f948430-7d30-45f5-95eb-559952c47cdd/volumes" Sep 30 14:20:08 crc kubenswrapper[4676]: I0930 14:20:08.594148 4676 scope.go:117] "RemoveContainer" containerID="8d2808d46e31d0ad70aeded8f32540515c94b69af77502ff1e7df89936508090" Sep 30 14:20:08 crc kubenswrapper[4676]: E0930 14:20:08.672488 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Sep 30 14:20:08 crc kubenswrapper[4676]: E0930 14:20:08.672677 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7czf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wxj4s_openstack(63080796-b0be-4b3a-8db5-8242e2eb2bb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 14:20:08 crc kubenswrapper[4676]: E0930 14:20:08.675026 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wxj4s" podUID="63080796-b0be-4b3a-8db5-8242e2eb2bb3" Sep 30 14:20:08 crc kubenswrapper[4676]: E0930 14:20:08.742916 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wxj4s" podUID="63080796-b0be-4b3a-8db5-8242e2eb2bb3" Sep 30 14:20:09 crc kubenswrapper[4676]: I0930 14:20:09.051056 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6668cdff8d-z8vnk"] Sep 30 14:20:09 crc kubenswrapper[4676]: I0930 14:20:09.122052 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g276r"] Sep 30 14:20:09 crc kubenswrapper[4676]: I0930 14:20:09.127057 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68fc47cdb4-6758j"] Sep 30 14:20:09 crc kubenswrapper[4676]: I0930 14:20:09.756956 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g276r" event={"ID":"da58f130-eba3-4e56-97c1-6eba2641fa7d","Type":"ContainerStarted","Data":"cf966f655507fb5be61948a7c3822d954f1f5b70242a56b443e856a0f11fd931"} Sep 30 14:20:09 crc kubenswrapper[4676]: I0930 14:20:09.759186 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6668cdff8d-z8vnk" event={"ID":"ace12602-f0f7-4f29-8c37-72c1e840bacc","Type":"ContainerStarted","Data":"03b907165789de2b1a209bdfadd9e19d3c353ee52f5e77bfa891efbacc53c12c"} Sep 30 14:20:09 crc kubenswrapper[4676]: I0930 14:20:09.761114 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fc47cdb4-6758j" event={"ID":"a020c8ba-b848-4a3f-80e4-b3692cf99ffa","Type":"ContainerStarted","Data":"d7cc488d7908bd96bb37867ceb583c546b6081b33237d4eb45b43be2416b9a1b"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.780356 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b5nzl" event={"ID":"c1fd6e1e-38da-4634-9862-21c027ea770a","Type":"ContainerStarted","Data":"b93a672a897fe21e83c7690dbae28e10b7055fe7971a695f3c3c8ae60a15a1ba"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.784714 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-45gkb" event={"ID":"b04eaad6-de72-4265-aa3d-fda03a0ea925","Type":"ContainerStarted","Data":"2d2735e743682cc3ec46f83546a934923f0909888643923321d0b0c8d3f62c52"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.789564 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5540da5e-a02e-437f-82a8-e0f74ac91760","Type":"ContainerStarted","Data":"1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.791962 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g276r" event={"ID":"da58f130-eba3-4e56-97c1-6eba2641fa7d","Type":"ContainerStarted","Data":"630dba75d1cd824e7fa8825717aee62c6199b7ac50d57cd0d1d5426a4357dd15"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.795848 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6668cdff8d-z8vnk" 
event={"ID":"ace12602-f0f7-4f29-8c37-72c1e840bacc","Type":"ContainerStarted","Data":"41f7da45c7537f91933be6cb3d5a7cc0b180938f05b9eead2baa6c684dcaccdd"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.795912 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6668cdff8d-z8vnk" event={"ID":"ace12602-f0f7-4f29-8c37-72c1e840bacc","Type":"ContainerStarted","Data":"e982bb57f0c69fac6b00b72b2db739d79c8214c812ea9ff516f0a7d6e34a9939"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.798925 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fc47cdb4-6758j" event={"ID":"a020c8ba-b848-4a3f-80e4-b3692cf99ffa","Type":"ContainerStarted","Data":"b52c2ec6e39f5d7c9e0997d0a897194f84db976f766e8a111120878ff7f4ab82"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.798961 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fc47cdb4-6758j" event={"ID":"a020c8ba-b848-4a3f-80e4-b3692cf99ffa","Type":"ContainerStarted","Data":"f8e711d870ab4db1971d37d7293af330f311c701e1b21e98a09ddf30486840db"} Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.808281 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6668cdff8d-z8vnk" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.808325 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6668cdff8d-z8vnk" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.813910 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-b5nzl" podStartSLOduration=3.207884326 podStartE2EDuration="49.813890716s" podCreationTimestamp="2025-09-30 14:19:22 +0000 UTC" firstStartedPulling="2025-09-30 14:19:24.001143616 +0000 UTC m=+1267.984232045" lastFinishedPulling="2025-09-30 14:20:10.607150006 +0000 UTC m=+1314.590238435" observedRunningTime="2025-09-30 14:20:11.80491218 +0000 UTC m=+1315.788000619" 
watchObservedRunningTime="2025-09-30 14:20:11.813890716 +0000 UTC m=+1315.796979145" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.830508 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-g276r" podStartSLOduration=4.830491282 podStartE2EDuration="4.830491282s" podCreationTimestamp="2025-09-30 14:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:11.824436622 +0000 UTC m=+1315.807525081" watchObservedRunningTime="2025-09-30 14:20:11.830491282 +0000 UTC m=+1315.813579711" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.849268 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-45gkb" podStartSLOduration=3.835544733 podStartE2EDuration="49.849249363s" podCreationTimestamp="2025-09-30 14:19:22 +0000 UTC" firstStartedPulling="2025-09-30 14:19:24.577856818 +0000 UTC m=+1268.560945257" lastFinishedPulling="2025-09-30 14:20:10.591561458 +0000 UTC m=+1314.574649887" observedRunningTime="2025-09-30 14:20:11.846702116 +0000 UTC m=+1315.829790555" watchObservedRunningTime="2025-09-30 14:20:11.849249363 +0000 UTC m=+1315.832337792" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.873618 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6668cdff8d-z8vnk" podStartSLOduration=39.426679096 podStartE2EDuration="40.87359253s" podCreationTimestamp="2025-09-30 14:19:31 +0000 UTC" firstStartedPulling="2025-09-30 14:20:09.652119684 +0000 UTC m=+1313.635208113" lastFinishedPulling="2025-09-30 14:20:11.099033118 +0000 UTC m=+1315.082121547" observedRunningTime="2025-09-30 14:20:11.86518778 +0000 UTC m=+1315.848276209" watchObservedRunningTime="2025-09-30 14:20:11.87359253 +0000 UTC m=+1315.856680969" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.895921 4676 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-68fc47cdb4-6758j" podStartSLOduration=39.449336608 podStartE2EDuration="40.895898915s" podCreationTimestamp="2025-09-30 14:19:31 +0000 UTC" firstStartedPulling="2025-09-30 14:20:09.654261169 +0000 UTC m=+1313.637349598" lastFinishedPulling="2025-09-30 14:20:11.100823476 +0000 UTC m=+1315.083911905" observedRunningTime="2025-09-30 14:20:11.88615838 +0000 UTC m=+1315.869246809" watchObservedRunningTime="2025-09-30 14:20:11.895898915 +0000 UTC m=+1315.878987344" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.896649 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-68fc47cdb4-6758j" Sep 30 14:20:11 crc kubenswrapper[4676]: I0930 14:20:11.896713 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68fc47cdb4-6758j" Sep 30 14:20:14 crc kubenswrapper[4676]: I0930 14:20:14.829982 4676 generic.go:334] "Generic (PLEG): container finished" podID="da58f130-eba3-4e56-97c1-6eba2641fa7d" containerID="630dba75d1cd824e7fa8825717aee62c6199b7ac50d57cd0d1d5426a4357dd15" exitCode=0 Sep 30 14:20:14 crc kubenswrapper[4676]: I0930 14:20:14.830088 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g276r" event={"ID":"da58f130-eba3-4e56-97c1-6eba2641fa7d","Type":"ContainerDied","Data":"630dba75d1cd824e7fa8825717aee62c6199b7ac50d57cd0d1d5426a4357dd15"} Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.212948 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.333408 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-combined-ca-bundle\") pod \"da58f130-eba3-4e56-97c1-6eba2641fa7d\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.333720 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-credential-keys\") pod \"da58f130-eba3-4e56-97c1-6eba2641fa7d\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.333741 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-config-data\") pod \"da58f130-eba3-4e56-97c1-6eba2641fa7d\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.333775 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmx44\" (UniqueName: \"kubernetes.io/projected/da58f130-eba3-4e56-97c1-6eba2641fa7d-kube-api-access-rmx44\") pod \"da58f130-eba3-4e56-97c1-6eba2641fa7d\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.333824 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-scripts\") pod \"da58f130-eba3-4e56-97c1-6eba2641fa7d\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.333856 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-fernet-keys\") pod \"da58f130-eba3-4e56-97c1-6eba2641fa7d\" (UID: \"da58f130-eba3-4e56-97c1-6eba2641fa7d\") " Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.341302 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-scripts" (OuterVolumeSpecName: "scripts") pod "da58f130-eba3-4e56-97c1-6eba2641fa7d" (UID: "da58f130-eba3-4e56-97c1-6eba2641fa7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.341445 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "da58f130-eba3-4e56-97c1-6eba2641fa7d" (UID: "da58f130-eba3-4e56-97c1-6eba2641fa7d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.342300 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da58f130-eba3-4e56-97c1-6eba2641fa7d-kube-api-access-rmx44" (OuterVolumeSpecName: "kube-api-access-rmx44") pod "da58f130-eba3-4e56-97c1-6eba2641fa7d" (UID: "da58f130-eba3-4e56-97c1-6eba2641fa7d"). InnerVolumeSpecName "kube-api-access-rmx44". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.353540 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "da58f130-eba3-4e56-97c1-6eba2641fa7d" (UID: "da58f130-eba3-4e56-97c1-6eba2641fa7d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.363809 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-config-data" (OuterVolumeSpecName: "config-data") pod "da58f130-eba3-4e56-97c1-6eba2641fa7d" (UID: "da58f130-eba3-4e56-97c1-6eba2641fa7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.366117 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da58f130-eba3-4e56-97c1-6eba2641fa7d" (UID: "da58f130-eba3-4e56-97c1-6eba2641fa7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.435383 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.435420 4676 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.435430 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.435439 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmx44\" (UniqueName: \"kubernetes.io/projected/da58f130-eba3-4e56-97c1-6eba2641fa7d-kube-api-access-rmx44\") on node \"crc\" 
DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.435450 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.435459 4676 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da58f130-eba3-4e56-97c1-6eba2641fa7d-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.863276 4676 generic.go:334] "Generic (PLEG): container finished" podID="b04eaad6-de72-4265-aa3d-fda03a0ea925" containerID="2d2735e743682cc3ec46f83546a934923f0909888643923321d0b0c8d3f62c52" exitCode=0 Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.863354 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-45gkb" event={"ID":"b04eaad6-de72-4265-aa3d-fda03a0ea925","Type":"ContainerDied","Data":"2d2735e743682cc3ec46f83546a934923f0909888643923321d0b0c8d3f62c52"} Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.865522 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5540da5e-a02e-437f-82a8-e0f74ac91760","Type":"ContainerStarted","Data":"ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0"} Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.868768 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g276r" event={"ID":"da58f130-eba3-4e56-97c1-6eba2641fa7d","Type":"ContainerDied","Data":"cf966f655507fb5be61948a7c3822d954f1f5b70242a56b443e856a0f11fd931"} Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.868810 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf966f655507fb5be61948a7c3822d954f1f5b70242a56b443e856a0f11fd931" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.868897 4676 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g276r" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.939424 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d85658b59-96mgj"] Sep 30 14:20:16 crc kubenswrapper[4676]: E0930 14:20:16.939854 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da58f130-eba3-4e56-97c1-6eba2641fa7d" containerName="keystone-bootstrap" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.939887 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="da58f130-eba3-4e56-97c1-6eba2641fa7d" containerName="keystone-bootstrap" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.940079 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="da58f130-eba3-4e56-97c1-6eba2641fa7d" containerName="keystone-bootstrap" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.940642 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.942987 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.943024 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.946005 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.946017 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.946151 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.946227 4676 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-keystone-dockercfg-7jtlk" Sep 30 14:20:16 crc kubenswrapper[4676]: I0930 14:20:16.953388 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d85658b59-96mgj"] Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.045805 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-internal-tls-certs\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.045854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mgd\" (UniqueName: \"kubernetes.io/projected/4e10025d-8396-4100-8652-3358d52c3199-kube-api-access-t2mgd\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.045929 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-credential-keys\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.045959 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-combined-ca-bundle\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.045987 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-fernet-keys\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.046005 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-public-tls-certs\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.046040 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-config-data\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.046088 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-scripts\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147534 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-internal-tls-certs\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147587 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mgd\" (UniqueName: 
\"kubernetes.io/projected/4e10025d-8396-4100-8652-3358d52c3199-kube-api-access-t2mgd\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147624 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-credential-keys\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147656 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-combined-ca-bundle\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147688 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-fernet-keys\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147714 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-public-tls-certs\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-config-data\") pod 
\"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.147795 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-scripts\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.151125 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-scripts\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.152243 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-fernet-keys\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.152322 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-internal-tls-certs\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.153119 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-credential-keys\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: 
I0930 14:20:17.156337 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-public-tls-certs\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.157733 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-config-data\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.165794 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e10025d-8396-4100-8652-3358d52c3199-combined-ca-bundle\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.168292 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mgd\" (UniqueName: \"kubernetes.io/projected/4e10025d-8396-4100-8652-3358d52c3199-kube-api-access-t2mgd\") pod \"keystone-d85658b59-96mgj\" (UID: \"4e10025d-8396-4100-8652-3358d52c3199\") " pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.260789 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:17 crc kubenswrapper[4676]: E0930 14:20:17.331662 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7724b184_865f_4ced_bdf7_867184cf3647.slice/crio-086ac2a3788928169cf290661c3c50dd7ad4957cff71c7bde933f95169f8b4ef.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.745518 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d85658b59-96mgj"] Sep 30 14:20:17 crc kubenswrapper[4676]: W0930 14:20:17.758362 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e10025d_8396_4100_8652_3358d52c3199.slice/crio-ccfdd1e7dbb7546bd1eefea0f01b2343f553022afeec5f3f19c0e3276ab0c005 WatchSource:0}: Error finding container ccfdd1e7dbb7546bd1eefea0f01b2343f553022afeec5f3f19c0e3276ab0c005: Status 404 returned error can't find the container with id ccfdd1e7dbb7546bd1eefea0f01b2343f553022afeec5f3f19c0e3276ab0c005 Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.883610 4676 generic.go:334] "Generic (PLEG): container finished" podID="7724b184-865f-4ced-bdf7-867184cf3647" containerID="086ac2a3788928169cf290661c3c50dd7ad4957cff71c7bde933f95169f8b4ef" exitCode=0 Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.883681 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptlq" event={"ID":"7724b184-865f-4ced-bdf7-867184cf3647","Type":"ContainerDied","Data":"086ac2a3788928169cf290661c3c50dd7ad4957cff71c7bde933f95169f8b4ef"} Sep 30 14:20:17 crc kubenswrapper[4676]: I0930 14:20:17.886697 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d85658b59-96mgj" 
event={"ID":"4e10025d-8396-4100-8652-3358d52c3199","Type":"ContainerStarted","Data":"ccfdd1e7dbb7546bd1eefea0f01b2343f553022afeec5f3f19c0e3276ab0c005"} Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.318302 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-45gkb" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.473668 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-config-data\") pod \"b04eaad6-de72-4265-aa3d-fda03a0ea925\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.473824 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fngq9\" (UniqueName: \"kubernetes.io/projected/b04eaad6-de72-4265-aa3d-fda03a0ea925-kube-api-access-fngq9\") pod \"b04eaad6-de72-4265-aa3d-fda03a0ea925\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.473857 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04eaad6-de72-4265-aa3d-fda03a0ea925-logs\") pod \"b04eaad6-de72-4265-aa3d-fda03a0ea925\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.473918 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-combined-ca-bundle\") pod \"b04eaad6-de72-4265-aa3d-fda03a0ea925\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.473999 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-scripts\") pod 
\"b04eaad6-de72-4265-aa3d-fda03a0ea925\" (UID: \"b04eaad6-de72-4265-aa3d-fda03a0ea925\") " Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.475356 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b04eaad6-de72-4265-aa3d-fda03a0ea925-logs" (OuterVolumeSpecName: "logs") pod "b04eaad6-de72-4265-aa3d-fda03a0ea925" (UID: "b04eaad6-de72-4265-aa3d-fda03a0ea925"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.479508 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04eaad6-de72-4265-aa3d-fda03a0ea925-kube-api-access-fngq9" (OuterVolumeSpecName: "kube-api-access-fngq9") pod "b04eaad6-de72-4265-aa3d-fda03a0ea925" (UID: "b04eaad6-de72-4265-aa3d-fda03a0ea925"). InnerVolumeSpecName "kube-api-access-fngq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.480074 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-scripts" (OuterVolumeSpecName: "scripts") pod "b04eaad6-de72-4265-aa3d-fda03a0ea925" (UID: "b04eaad6-de72-4265-aa3d-fda03a0ea925"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.503005 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b04eaad6-de72-4265-aa3d-fda03a0ea925" (UID: "b04eaad6-de72-4265-aa3d-fda03a0ea925"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.514019 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-config-data" (OuterVolumeSpecName: "config-data") pod "b04eaad6-de72-4265-aa3d-fda03a0ea925" (UID: "b04eaad6-de72-4265-aa3d-fda03a0ea925"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.576209 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fngq9\" (UniqueName: \"kubernetes.io/projected/b04eaad6-de72-4265-aa3d-fda03a0ea925-kube-api-access-fngq9\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.576244 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b04eaad6-de72-4265-aa3d-fda03a0ea925-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.576253 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.576264 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.576272 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b04eaad6-de72-4265-aa3d-fda03a0ea925-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.912055 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-45gkb" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.912048 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-45gkb" event={"ID":"b04eaad6-de72-4265-aa3d-fda03a0ea925","Type":"ContainerDied","Data":"384c2dfa2233bfc3f317ddfb8882b62680b0db352c8bb0276707a18523693847"} Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.912278 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384c2dfa2233bfc3f317ddfb8882b62680b0db352c8bb0276707a18523693847" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.919988 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d85658b59-96mgj" event={"ID":"4e10025d-8396-4100-8652-3358d52c3199","Type":"ContainerStarted","Data":"03dcb1d69c1921d3c7851df074dbb28399a73ead96057223e5bbd35620a1a46e"} Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.920055 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:18 crc kubenswrapper[4676]: I0930 14:20:18.954853 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d85658b59-96mgj" podStartSLOduration=2.954831409 podStartE2EDuration="2.954831409s" podCreationTimestamp="2025-09-30 14:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:18.951456471 +0000 UTC m=+1322.934544910" watchObservedRunningTime="2025-09-30 14:20:18.954831409 +0000 UTC m=+1322.937919838" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.019300 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6558bbc9d4-wdcbn"] Sep 30 14:20:19 crc kubenswrapper[4676]: E0930 14:20:19.019786 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04eaad6-de72-4265-aa3d-fda03a0ea925" 
containerName="placement-db-sync" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.019810 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04eaad6-de72-4265-aa3d-fda03a0ea925" containerName="placement-db-sync" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.020106 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04eaad6-de72-4265-aa3d-fda03a0ea925" containerName="placement-db-sync" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.026029 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.028858 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.029187 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.029360 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.029512 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ml554" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.033694 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6558bbc9d4-wdcbn"] Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.036075 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.186755 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-config-data\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " 
pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.186833 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-public-tls-certs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.187746 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-combined-ca-bundle\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.187973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwh87\" (UniqueName: \"kubernetes.io/projected/c5505b25-a501-44f0-8b24-6630fb71d41b-kube-api-access-rwh87\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.188155 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-internal-tls-certs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.188236 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5505b25-a501-44f0-8b24-6630fb71d41b-logs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: 
\"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.188575 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-scripts\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.290615 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-internal-tls-certs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.290681 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5505b25-a501-44f0-8b24-6630fb71d41b-logs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.290777 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-scripts\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.290851 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-config-data\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc 
kubenswrapper[4676]: I0930 14:20:19.290940 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-public-tls-certs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.290974 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-combined-ca-bundle\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.291036 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwh87\" (UniqueName: \"kubernetes.io/projected/c5505b25-a501-44f0-8b24-6630fb71d41b-kube-api-access-rwh87\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.293478 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5505b25-a501-44f0-8b24-6630fb71d41b-logs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.298858 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-internal-tls-certs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.299836 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-scripts\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.310673 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-config-data\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.314622 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwh87\" (UniqueName: \"kubernetes.io/projected/c5505b25-a501-44f0-8b24-6630fb71d41b-kube-api-access-rwh87\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.317633 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-public-tls-certs\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.322650 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5505b25-a501-44f0-8b24-6630fb71d41b-combined-ca-bundle\") pod \"placement-6558bbc9d4-wdcbn\" (UID: \"c5505b25-a501-44f0-8b24-6630fb71d41b\") " pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.349769 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.859371 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6558bbc9d4-wdcbn"] Sep 30 14:20:19 crc kubenswrapper[4676]: I0930 14:20:19.937864 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558bbc9d4-wdcbn" event={"ID":"c5505b25-a501-44f0-8b24-6630fb71d41b","Type":"ContainerStarted","Data":"e11815673e0133367f47011c20cbc5d28bee74b0d33cfa49dd3b19885c24bb3a"} Sep 30 14:20:20 crc kubenswrapper[4676]: I0930 14:20:20.947644 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558bbc9d4-wdcbn" event={"ID":"c5505b25-a501-44f0-8b24-6630fb71d41b","Type":"ContainerStarted","Data":"7adfb2b449a555f07cfb7cca49f128b6263efab9c373e2452e905535b968af7b"} Sep 30 14:20:21 crc kubenswrapper[4676]: I0930 14:20:21.803502 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6668cdff8d-z8vnk" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Sep 30 14:20:21 crc kubenswrapper[4676]: I0930 14:20:21.904037 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68fc47cdb4-6758j" podUID="a020c8ba-b848-4a3f-80e4-b3692cf99ffa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.822709 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fptlq" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.867912 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-combined-ca-bundle\") pod \"7724b184-865f-4ced-bdf7-867184cf3647\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.867969 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-config-data\") pod \"7724b184-865f-4ced-bdf7-867184cf3647\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.867997 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-db-sync-config-data\") pod \"7724b184-865f-4ced-bdf7-867184cf3647\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.868029 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9whng\" (UniqueName: \"kubernetes.io/projected/7724b184-865f-4ced-bdf7-867184cf3647-kube-api-access-9whng\") pod \"7724b184-865f-4ced-bdf7-867184cf3647\" (UID: \"7724b184-865f-4ced-bdf7-867184cf3647\") " Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.874620 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7724b184-865f-4ced-bdf7-867184cf3647" (UID: "7724b184-865f-4ced-bdf7-867184cf3647"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.875067 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7724b184-865f-4ced-bdf7-867184cf3647-kube-api-access-9whng" (OuterVolumeSpecName: "kube-api-access-9whng") pod "7724b184-865f-4ced-bdf7-867184cf3647" (UID: "7724b184-865f-4ced-bdf7-867184cf3647"). InnerVolumeSpecName "kube-api-access-9whng". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.900674 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7724b184-865f-4ced-bdf7-867184cf3647" (UID: "7724b184-865f-4ced-bdf7-867184cf3647"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.922907 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-config-data" (OuterVolumeSpecName: "config-data") pod "7724b184-865f-4ced-bdf7-867184cf3647" (UID: "7724b184-865f-4ced-bdf7-867184cf3647"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.970083 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.970133 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.970144 4676 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7724b184-865f-4ced-bdf7-867184cf3647-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.970154 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9whng\" (UniqueName: \"kubernetes.io/projected/7724b184-865f-4ced-bdf7-867184cf3647-kube-api-access-9whng\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.971628 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fptlq" event={"ID":"7724b184-865f-4ced-bdf7-867184cf3647","Type":"ContainerDied","Data":"d892166cd8fb315cc629792785ae20766ebb2350fd835e3907ed3005eb1b89f1"} Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.971659 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d892166cd8fb315cc629792785ae20766ebb2350fd835e3907ed3005eb1b89f1" Sep 30 14:20:23 crc kubenswrapper[4676]: I0930 14:20:23.971707 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fptlq" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.227682 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-d6pxb"] Sep 30 14:20:25 crc kubenswrapper[4676]: E0930 14:20:25.228461 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7724b184-865f-4ced-bdf7-867184cf3647" containerName="glance-db-sync" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.228482 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7724b184-865f-4ced-bdf7-867184cf3647" containerName="glance-db-sync" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.228743 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7724b184-865f-4ced-bdf7-867184cf3647" containerName="glance-db-sync" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.230190 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.238369 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-d6pxb"] Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.292101 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-config\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.292176 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbtq5\" (UniqueName: \"kubernetes.io/projected/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-kube-api-access-vbtq5\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc 
kubenswrapper[4676]: I0930 14:20:25.292206 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.292225 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.292270 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.292335 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.393491 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc 
kubenswrapper[4676]: I0930 14:20:25.394637 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-config\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.394563 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.395277 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-config\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.396813 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbtq5\" (UniqueName: \"kubernetes.io/projected/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-kube-api-access-vbtq5\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.396893 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.397584 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.396932 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.397746 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.398432 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.398617 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.426261 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbtq5\" (UniqueName: 
\"kubernetes.io/projected/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-kube-api-access-vbtq5\") pod \"dnsmasq-dns-8b5c85b87-d6pxb\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:25 crc kubenswrapper[4676]: I0930 14:20:25.551826 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.257398 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.258944 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.261085 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.261395 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.261498 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-59zsj" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.266922 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.319798 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.319857 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.319908 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.319955 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.319996 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.320082 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-logs\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.320136 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-46bct\" (UniqueName: \"kubernetes.io/projected/7d4676fc-805d-4ece-9bdb-d58750217b99-kube-api-access-46bct\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.352321 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.354157 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.363781 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.417145 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.423078 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.423388 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.423421 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.423465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.423501 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.423587 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-logs\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.423633 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46bct\" (UniqueName: \"kubernetes.io/projected/7d4676fc-805d-4ece-9bdb-d58750217b99-kube-api-access-46bct\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.424052 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " 
pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.428507 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-logs\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.430697 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.432042 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-scripts\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.433738 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.434844 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-config-data\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 
14:20:26.444804 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46bct\" (UniqueName: \"kubernetes.io/projected/7d4676fc-805d-4ece-9bdb-d58750217b99-kube-api-access-46bct\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.486405 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.525926 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-logs\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.526017 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.526043 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.526091 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt2kv\" (UniqueName: \"kubernetes.io/projected/313bd90a-adda-45d6-a1ee-970b9277b7ff-kube-api-access-zt2kv\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.526145 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.526180 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.526261 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.586609 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.628262 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.628312 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.628395 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.628442 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-logs\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.628476 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc 
kubenswrapper[4676]: I0930 14:20:26.628491 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.628525 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt2kv\" (UniqueName: \"kubernetes.io/projected/313bd90a-adda-45d6-a1ee-970b9277b7ff-kube-api-access-zt2kv\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.629184 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-logs\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.629298 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.629873 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.635038 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.637634 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.648415 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.649534 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt2kv\" (UniqueName: \"kubernetes.io/projected/313bd90a-adda-45d6-a1ee-970b9277b7ff-kube-api-access-zt2kv\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.677910 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:26 crc kubenswrapper[4676]: I0930 14:20:26.696676 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.010408 4676 generic.go:334] "Generic (PLEG): container finished" podID="c1fd6e1e-38da-4634-9862-21c027ea770a" containerID="b93a672a897fe21e83c7690dbae28e10b7055fe7971a695f3c3c8ae60a15a1ba" exitCode=0 Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.010512 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b5nzl" event={"ID":"c1fd6e1e-38da-4634-9862-21c027ea770a","Type":"ContainerDied","Data":"b93a672a897fe21e83c7690dbae28e10b7055fe7971a695f3c3c8ae60a15a1ba"} Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.013109 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6558bbc9d4-wdcbn" event={"ID":"c5505b25-a501-44f0-8b24-6630fb71d41b","Type":"ContainerStarted","Data":"47461fb3b6359e5cc684730c99a1489dda74dcf9e9244a2f34ac84183bad964b"} Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.013373 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.013854 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.813904 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6558bbc9d4-wdcbn" podStartSLOduration=9.813865856 podStartE2EDuration="9.813865856s" podCreationTimestamp="2025-09-30 14:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:27.05214771 +0000 UTC m=+1331.035236139" watchObservedRunningTime="2025-09-30 14:20:27.813865856 +0000 UTC m=+1331.796954285" Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.816266 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:27 crc kubenswrapper[4676]: I0930 14:20:27.887941 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:29 crc kubenswrapper[4676]: I0930 14:20:29.847552 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:29 crc kubenswrapper[4676]: I0930 14:20:29.920210 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:20:29 crc kubenswrapper[4676]: I0930 14:20:29.920576 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:20:29 crc kubenswrapper[4676]: I0930 14:20:29.920640 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:20:29 crc kubenswrapper[4676]: I0930 14:20:29.921546 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e91fb257d3a45cd5a74b5617de04aa40d1ce872ef596abb2a4557538639b58d"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:20:29 crc kubenswrapper[4676]: I0930 14:20:29.921623 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" 
podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://5e91fb257d3a45cd5a74b5617de04aa40d1ce872ef596abb2a4557538639b58d" gracePeriod=600 Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.072297 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="5e91fb257d3a45cd5a74b5617de04aa40d1ce872ef596abb2a4557538639b58d" exitCode=0 Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.072352 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"5e91fb257d3a45cd5a74b5617de04aa40d1ce872ef596abb2a4557538639b58d"} Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.072389 4676 scope.go:117] "RemoveContainer" containerID="124b2a96400d24919cd22f949ea67e3aa5eaa3e8b7e92aeb3ff38f0b94aac9fe" Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.712724 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.762389 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jktg8\" (UniqueName: \"kubernetes.io/projected/c1fd6e1e-38da-4634-9862-21c027ea770a-kube-api-access-jktg8\") pod \"c1fd6e1e-38da-4634-9862-21c027ea770a\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.762549 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-combined-ca-bundle\") pod \"c1fd6e1e-38da-4634-9862-21c027ea770a\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.762615 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-db-sync-config-data\") pod \"c1fd6e1e-38da-4634-9862-21c027ea770a\" (UID: \"c1fd6e1e-38da-4634-9862-21c027ea770a\") " Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.770199 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1fd6e1e-38da-4634-9862-21c027ea770a-kube-api-access-jktg8" (OuterVolumeSpecName: "kube-api-access-jktg8") pod "c1fd6e1e-38da-4634-9862-21c027ea770a" (UID: "c1fd6e1e-38da-4634-9862-21c027ea770a"). InnerVolumeSpecName "kube-api-access-jktg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.771353 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c1fd6e1e-38da-4634-9862-21c027ea770a" (UID: "c1fd6e1e-38da-4634-9862-21c027ea770a"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.792803 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1fd6e1e-38da-4634-9862-21c027ea770a" (UID: "c1fd6e1e-38da-4634-9862-21c027ea770a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.865292 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.865339 4676 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1fd6e1e-38da-4634-9862-21c027ea770a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:30 crc kubenswrapper[4676]: I0930 14:20:30.865356 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jktg8\" (UniqueName: \"kubernetes.io/projected/c1fd6e1e-38da-4634-9862-21c027ea770a-kube-api-access-jktg8\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.082639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-b5nzl" event={"ID":"c1fd6e1e-38da-4634-9862-21c027ea770a","Type":"ContainerDied","Data":"bdfd0f30116ee60462a628ab7dedbf55fad6923b35dbf6e0076e8a5e1cf426e6"} Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.082686 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdfd0f30116ee60462a628ab7dedbf55fad6923b35dbf6e0076e8a5e1cf426e6" Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.082735 4676 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-b5nzl" Sep 30 14:20:31 crc kubenswrapper[4676]: E0930 14:20:31.303070 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Sep 30 14:20:31 crc kubenswrapper[4676]: E0930 14:20:31.303290 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nt82t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5540da5e-a02e-437f-82a8-e0f74ac91760): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 14:20:31 crc kubenswrapper[4676]: E0930 14:20:31.304538 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.803520 4676 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-6668cdff8d-z8vnk" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.817058 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.894579 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-d6pxb"] Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.899762 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68fc47cdb4-6758j" podUID="a020c8ba-b848-4a3f-80e4-b3692cf99ffa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Sep 30 14:20:31 crc kubenswrapper[4676]: I0930 14:20:31.936944 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.056953 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-bf6466755-m2t9t"] Sep 30 14:20:32 crc kubenswrapper[4676]: E0930 14:20:32.057637 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fd6e1e-38da-4634-9862-21c027ea770a" containerName="barbican-db-sync" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.057656 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fd6e1e-38da-4634-9862-21c027ea770a" containerName="barbican-db-sync" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.057903 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1fd6e1e-38da-4634-9862-21c027ea770a" containerName="barbican-db-sync" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.059026 4676 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.065792 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.066002 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-g2kbp" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.066126 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.086066 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55zc9\" (UniqueName: \"kubernetes.io/projected/39f521d2-b195-4179-a114-1c1611e4ba2f-kube-api-access-55zc9\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.086121 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f521d2-b195-4179-a114-1c1611e4ba2f-logs\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.086167 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-config-data\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.086288 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-config-data-custom\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.086338 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-combined-ca-bundle\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.128932 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-bf6466755-m2t9t"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.169329 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d4676fc-805d-4ece-9bdb-d58750217b99","Type":"ContainerStarted","Data":"667fcbd801572bc8881a8d4ca65fac2b92be93f9ac58ee075ab914a131cd1891"} Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.171467 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"313bd90a-adda-45d6-a1ee-970b9277b7ff","Type":"ContainerStarted","Data":"f24ca228bb13f0e84cb2a099db9ee909241e2bbfbff2de83c250b632656323a4"} Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.182515 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"629ff1f63cc08d0a90639ac0dfdfcd800429b2ca079d983358e85abb811d00e0"} Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.188222 4676 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="ceilometer-notification-agent" containerID="cri-o://1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef" gracePeriod=30 Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.188503 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" event={"ID":"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2","Type":"ContainerStarted","Data":"26aed09f0b994af604ca7131c3d8415ec30d5b9aa2881340474a18598d54d234"} Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.188554 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="sg-core" containerID="cri-o://ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0" gracePeriod=30 Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.194508 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-config-data-custom\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.194619 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-combined-ca-bundle\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.194693 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55zc9\" (UniqueName: \"kubernetes.io/projected/39f521d2-b195-4179-a114-1c1611e4ba2f-kube-api-access-55zc9\") pod 
\"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.194713 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f521d2-b195-4179-a114-1c1611e4ba2f-logs\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.194768 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-config-data\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.198763 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f521d2-b195-4179-a114-1c1611e4ba2f-logs\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.205450 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-config-data\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.205684 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-777c6c994b-kk5rn"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.210600 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.220863 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.240737 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-combined-ca-bundle\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.250048 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39f521d2-b195-4179-a114-1c1611e4ba2f-config-data-custom\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.251052 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-777c6c994b-kk5rn"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.252619 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55zc9\" (UniqueName: \"kubernetes.io/projected/39f521d2-b195-4179-a114-1c1611e4ba2f-kube-api-access-55zc9\") pod \"barbican-worker-bf6466755-m2t9t\" (UID: \"39f521d2-b195-4179-a114-1c1611e4ba2f\") " pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.297793 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-combined-ca-bundle\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: 
\"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.297962 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn9b2\" (UniqueName: \"kubernetes.io/projected/70c7535f-4b3b-438f-9470-c857ece73452-kube-api-access-kn9b2\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.297994 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c7535f-4b3b-438f-9470-c857ece73452-logs\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.298015 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-config-data-custom\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.298047 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-config-data\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.388261 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-8b5c85b87-d6pxb"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.431846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c7535f-4b3b-438f-9470-c857ece73452-logs\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.431945 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-config-data-custom\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.432042 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-config-data\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.432182 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-combined-ca-bundle\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.432435 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn9b2\" (UniqueName: \"kubernetes.io/projected/70c7535f-4b3b-438f-9470-c857ece73452-kube-api-access-kn9b2\") pod 
\"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.447429 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70c7535f-4b3b-438f-9470-c857ece73452-logs\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.455585 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-config-data-custom\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.456854 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-hnlsg"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.460456 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.468875 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-combined-ca-bundle\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.470206 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-bf6466755-m2t9t" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.471266 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn9b2\" (UniqueName: \"kubernetes.io/projected/70c7535f-4b3b-438f-9470-c857ece73452-kube-api-access-kn9b2\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.476554 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c7535f-4b3b-438f-9470-c857ece73452-config-data\") pod \"barbican-keystone-listener-777c6c994b-kk5rn\" (UID: \"70c7535f-4b3b-438f-9470-c857ece73452\") " pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.535688 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-hnlsg"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.535791 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.535948 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhc5d\" (UniqueName: \"kubernetes.io/projected/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-kube-api-access-lhc5d\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.535977 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.536003 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-config\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.536055 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.536134 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.553399 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6ccd8d4586-px8lq"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.556804 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.560292 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.595918 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6ccd8d4586-px8lq"] Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.641959 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data-custom\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642229 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23c298b3-265b-4261-80d8-a79bc312db01-logs\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fvvd\" (UniqueName: \"kubernetes.io/projected/23c298b3-265b-4261-80d8-a79bc312db01-kube-api-access-8fvvd\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642463 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhc5d\" (UniqueName: \"kubernetes.io/projected/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-kube-api-access-lhc5d\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " 
pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642519 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642560 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-config\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642673 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642708 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.642767 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-combined-ca-bundle\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" 
Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.643074 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.643298 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.644324 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.644356 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.644713 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.645056 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-config\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.645274 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.662195 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.664778 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhc5d\" (UniqueName: \"kubernetes.io/projected/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-kube-api-access-lhc5d\") pod \"dnsmasq-dns-59d5ff467f-hnlsg\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.746517 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data-custom\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.746569 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23c298b3-265b-4261-80d8-a79bc312db01-logs\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: 
\"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.746594 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fvvd\" (UniqueName: \"kubernetes.io/projected/23c298b3-265b-4261-80d8-a79bc312db01-kube-api-access-8fvvd\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.746652 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.746674 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-combined-ca-bundle\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.748031 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23c298b3-265b-4261-80d8-a79bc312db01-logs\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.754635 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data-custom\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " 
pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.762454 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.763997 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-combined-ca-bundle\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.791580 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fvvd\" (UniqueName: \"kubernetes.io/projected/23c298b3-265b-4261-80d8-a79bc312db01-kube-api-access-8fvvd\") pod \"barbican-api-6ccd8d4586-px8lq\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.844051 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:32 crc kubenswrapper[4676]: I0930 14:20:32.875258 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.132165 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-bf6466755-m2t9t"] Sep 30 14:20:33 crc kubenswrapper[4676]: W0930 14:20:33.157205 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39f521d2_b195_4179_a114_1c1611e4ba2f.slice/crio-0cbf891c71a505cc87816d07895cb3f9a95736a700542fcf4ba9cf56191b1f98 WatchSource:0}: Error finding container 0cbf891c71a505cc87816d07895cb3f9a95736a700542fcf4ba9cf56191b1f98: Status 404 returned error can't find the container with id 0cbf891c71a505cc87816d07895cb3f9a95736a700542fcf4ba9cf56191b1f98 Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.285524 4676 generic.go:334] "Generic (PLEG): container finished" podID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerID="ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0" exitCode=2 Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.285633 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5540da5e-a02e-437f-82a8-e0f74ac91760","Type":"ContainerDied","Data":"ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0"} Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.292930 4676 generic.go:334] "Generic (PLEG): container finished" podID="4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" containerID="8c17b5729f312f05be21c5f95a406e4692ef4aa38f57533588ab0cd32e8c10f5" exitCode=0 Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.292987 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" event={"ID":"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2","Type":"ContainerDied","Data":"8c17b5729f312f05be21c5f95a406e4692ef4aa38f57533588ab0cd32e8c10f5"} Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.293678 4676 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-keystone-listener-777c6c994b-kk5rn"] Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.306237 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf6466755-m2t9t" event={"ID":"39f521d2-b195-4179-a114-1c1611e4ba2f","Type":"ContainerStarted","Data":"0cbf891c71a505cc87816d07895cb3f9a95736a700542fcf4ba9cf56191b1f98"} Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.321232 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wxj4s" event={"ID":"63080796-b0be-4b3a-8db5-8242e2eb2bb3","Type":"ContainerStarted","Data":"75ae60f130bd580b01b0188c854fd3510ab5c9f6f7640cee5f9d9c91dd5d9345"} Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.342859 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wxj4s" podStartSLOduration=4.013679832 podStartE2EDuration="1m11.342834606s" podCreationTimestamp="2025-09-30 14:19:22 +0000 UTC" firstStartedPulling="2025-09-30 14:19:23.903957135 +0000 UTC m=+1267.887045564" lastFinishedPulling="2025-09-30 14:20:31.233111919 +0000 UTC m=+1335.216200338" observedRunningTime="2025-09-30 14:20:33.342153249 +0000 UTC m=+1337.325241678" watchObservedRunningTime="2025-09-30 14:20:33.342834606 +0000 UTC m=+1337.325923035" Sep 30 14:20:33 crc kubenswrapper[4676]: W0930 14:20:33.364265 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c7535f_4b3b_438f_9470_c857ece73452.slice/crio-b536c4dc99e50f176f67f5d220d3ea45375d6cc1abe69ca721d11ced8e7ab353 WatchSource:0}: Error finding container b536c4dc99e50f176f67f5d220d3ea45375d6cc1abe69ca721d11ced8e7ab353: Status 404 returned error can't find the container with id b536c4dc99e50f176f67f5d220d3ea45375d6cc1abe69ca721d11ced8e7ab353 Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.622009 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-6ccd8d4586-px8lq"] Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.672220 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-hnlsg"] Sep 30 14:20:33 crc kubenswrapper[4676]: W0930 14:20:33.716729 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8232ccd_2bcc_4cbe_b7b7_92bbe14cb14e.slice/crio-2f276f3570a080005a25bc08e9f6f6d81cf1403e8ff41344c84940724477d7a4 WatchSource:0}: Error finding container 2f276f3570a080005a25bc08e9f6f6d81cf1403e8ff41344c84940724477d7a4: Status 404 returned error can't find the container with id 2f276f3570a080005a25bc08e9f6f6d81cf1403e8ff41344c84940724477d7a4 Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.903013 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.985526 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbtq5\" (UniqueName: \"kubernetes.io/projected/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-kube-api-access-vbtq5\") pod \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.985590 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-nb\") pod \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.985659 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-svc\") pod \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\" (UID: 
\"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.985701 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-config\") pod \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.985762 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-sb\") pod \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " Sep 30 14:20:33 crc kubenswrapper[4676]: I0930 14:20:33.985982 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-swift-storage-0\") pod \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\" (UID: \"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.001254 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-kube-api-access-vbtq5" (OuterVolumeSpecName: "kube-api-access-vbtq5") pod "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" (UID: "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2"). InnerVolumeSpecName "kube-api-access-vbtq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.088638 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbtq5\" (UniqueName: \"kubernetes.io/projected/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-kube-api-access-vbtq5\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.154293 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" (UID: "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.161843 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-config" (OuterVolumeSpecName: "config") pod "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" (UID: "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.188856 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" (UID: "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.193325 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.193370 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.193384 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.213261 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" (UID: "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.218508 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" (UID: "4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.249281 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.294926 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-log-httpd\") pod \"5540da5e-a02e-437f-82a8-e0f74ac91760\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.295353 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-config-data\") pod \"5540da5e-a02e-437f-82a8-e0f74ac91760\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.295445 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt82t\" (UniqueName: \"kubernetes.io/projected/5540da5e-a02e-437f-82a8-e0f74ac91760-kube-api-access-nt82t\") pod \"5540da5e-a02e-437f-82a8-e0f74ac91760\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.295559 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5540da5e-a02e-437f-82a8-e0f74ac91760" (UID: "5540da5e-a02e-437f-82a8-e0f74ac91760"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.295591 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-sg-core-conf-yaml\") pod \"5540da5e-a02e-437f-82a8-e0f74ac91760\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.295620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-run-httpd\") pod \"5540da5e-a02e-437f-82a8-e0f74ac91760\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.295720 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-scripts\") pod \"5540da5e-a02e-437f-82a8-e0f74ac91760\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.295790 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-combined-ca-bundle\") pod \"5540da5e-a02e-437f-82a8-e0f74ac91760\" (UID: \"5540da5e-a02e-437f-82a8-e0f74ac91760\") " Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.296080 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5540da5e-a02e-437f-82a8-e0f74ac91760" (UID: "5540da5e-a02e-437f-82a8-e0f74ac91760"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.296810 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.296837 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.296849 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.296859 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5540da5e-a02e-437f-82a8-e0f74ac91760-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.300118 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-scripts" (OuterVolumeSpecName: "scripts") pod "5540da5e-a02e-437f-82a8-e0f74ac91760" (UID: "5540da5e-a02e-437f-82a8-e0f74ac91760"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.312140 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5540da5e-a02e-437f-82a8-e0f74ac91760-kube-api-access-nt82t" (OuterVolumeSpecName: "kube-api-access-nt82t") pod "5540da5e-a02e-437f-82a8-e0f74ac91760" (UID: "5540da5e-a02e-437f-82a8-e0f74ac91760"). InnerVolumeSpecName "kube-api-access-nt82t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.330967 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5540da5e-a02e-437f-82a8-e0f74ac91760" (UID: "5540da5e-a02e-437f-82a8-e0f74ac91760"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.337862 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-config-data" (OuterVolumeSpecName: "config-data") pod "5540da5e-a02e-437f-82a8-e0f74ac91760" (UID: "5540da5e-a02e-437f-82a8-e0f74ac91760"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.341274 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5540da5e-a02e-437f-82a8-e0f74ac91760" (UID: "5540da5e-a02e-437f-82a8-e0f74ac91760"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.346017 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" event={"ID":"4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2","Type":"ContainerDied","Data":"26aed09f0b994af604ca7131c3d8415ec30d5b9aa2881340474a18598d54d234"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.346063 4676 scope.go:117] "RemoveContainer" containerID="8c17b5729f312f05be21c5f95a406e4692ef4aa38f57533588ab0cd32e8c10f5" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.346071 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-d6pxb" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.358999 4676 generic.go:334] "Generic (PLEG): container finished" podID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerID="1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef" exitCode=0 Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.359076 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.359102 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5540da5e-a02e-437f-82a8-e0f74ac91760","Type":"ContainerDied","Data":"1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.359131 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5540da5e-a02e-437f-82a8-e0f74ac91760","Type":"ContainerDied","Data":"8e7ea66401a3fe8da0c30c75221e2fe6c4c0e4c20f8ec8a4a4a7dabc57daea2f"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.370046 4676 generic.go:334] "Generic (PLEG): container finished" podID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerID="e0ff3dafec2fa5e65f44a99bad7928dfa0aed5664e7b4c217b90419015664cae" exitCode=0 Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.370229 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" event={"ID":"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e","Type":"ContainerDied","Data":"e0ff3dafec2fa5e65f44a99bad7928dfa0aed5664e7b4c217b90419015664cae"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.370638 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" event={"ID":"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e","Type":"ContainerStarted","Data":"2f276f3570a080005a25bc08e9f6f6d81cf1403e8ff41344c84940724477d7a4"} Sep 30 14:20:34 crc 
kubenswrapper[4676]: I0930 14:20:34.372410 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6ccd8d4586-px8lq" event={"ID":"23c298b3-265b-4261-80d8-a79bc312db01","Type":"ContainerStarted","Data":"1e857e25ea7274253d5df5c66541f35ea25cac84b5e27f1cec64a3de51cd79c9"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.372446 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6ccd8d4586-px8lq" event={"ID":"23c298b3-265b-4261-80d8-a79bc312db01","Type":"ContainerStarted","Data":"816605b321b4fb0502edda9ac3180961b862a02030113387d09078a37cff4517"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.398639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d4676fc-805d-4ece-9bdb-d58750217b99","Type":"ContainerStarted","Data":"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.401035 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.401368 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.401501 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.401576 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt82t\" (UniqueName: \"kubernetes.io/projected/5540da5e-a02e-437f-82a8-e0f74ac91760-kube-api-access-nt82t\") on node \"crc\" DevicePath \"\"" Sep 30 
14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.401813 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5540da5e-a02e-437f-82a8-e0f74ac91760-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.422265 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"313bd90a-adda-45d6-a1ee-970b9277b7ff","Type":"ContainerStarted","Data":"7b2b958c2da252793f70c69edc708483f5fb5dbb3e3d7a840d0febb663fa281a"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.437872 4676 scope.go:117] "RemoveContainer" containerID="ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.452951 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.458894 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" event={"ID":"70c7535f-4b3b-438f-9470-c857ece73452","Type":"ContainerStarted","Data":"b536c4dc99e50f176f67f5d220d3ea45375d6cc1abe69ca721d11ced8e7ab353"} Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.492024 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.529188 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:20:34 crc kubenswrapper[4676]: E0930 14:20:34.529735 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" containerName="init" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.529762 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" containerName="init" Sep 30 14:20:34 crc kubenswrapper[4676]: E0930 14:20:34.529780 4676 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="ceilometer-notification-agent" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.529788 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="ceilometer-notification-agent" Sep 30 14:20:34 crc kubenswrapper[4676]: E0930 14:20:34.529814 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="sg-core" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.529823 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="sg-core" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.531216 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="ceilometer-notification-agent" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.531259 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" containerName="init" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.531276 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" containerName="sg-core" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.564944 4676 scope.go:117] "RemoveContainer" containerID="1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.570423 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.576249 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.576715 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.612669 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.612765 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-scripts\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.612919 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-config-data\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.612973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbkjd\" (UniqueName: \"kubernetes.io/projected/3822c455-1e30-49b4-8e75-1e880a010303-kube-api-access-kbkjd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.613055 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.613095 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-log-httpd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.613163 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-run-httpd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.613292 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-d6pxb"] Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.633971 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-d6pxb"] Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.652345 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.715410 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.716422 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-scripts\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.716670 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-config-data\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.716775 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbkjd\" (UniqueName: \"kubernetes.io/projected/3822c455-1e30-49b4-8e75-1e880a010303-kube-api-access-kbkjd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.716937 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.717052 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-log-httpd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.717229 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-run-httpd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc 
kubenswrapper[4676]: I0930 14:20:34.718032 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-log-httpd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.721680 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-run-httpd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.725934 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.737455 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.740225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-config-data\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.746470 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-scripts\") pod \"ceilometer-0\" (UID: 
\"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.747665 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbkjd\" (UniqueName: \"kubernetes.io/projected/3822c455-1e30-49b4-8e75-1e880a010303-kube-api-access-kbkjd\") pod \"ceilometer-0\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " pod="openstack/ceilometer-0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.903372 4676 scope.go:117] "RemoveContainer" containerID="ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0" Sep 30 14:20:34 crc kubenswrapper[4676]: E0930 14:20:34.906667 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0\": container with ID starting with ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0 not found: ID does not exist" containerID="ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.906730 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0"} err="failed to get container status \"ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0\": rpc error: code = NotFound desc = could not find container \"ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0\": container with ID starting with ea7cc09bbd28253f4e39517465438b41cce5f3bad49b276f7337fe769b0d5ec0 not found: ID does not exist" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.906769 4676 scope.go:117] "RemoveContainer" containerID="1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef" Sep 30 14:20:34 crc kubenswrapper[4676]: E0930 14:20:34.907256 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef\": container with ID starting with 1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef not found: ID does not exist" containerID="1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.907282 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef"} err="failed to get container status \"1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef\": rpc error: code = NotFound desc = could not find container \"1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef\": container with ID starting with 1de6c42bdacb6c16f5a7a5eb87b5c6e47392b47b8aeae74dd79a567f63dda5ef not found: ID does not exist" Sep 30 14:20:34 crc kubenswrapper[4676]: I0930 14:20:34.924588 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.456174 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2" path="/var/lib/kubelet/pods/4d2e19af-9b9c-4d3c-9dfe-fa974d7c54e2/volumes" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.457155 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5540da5e-a02e-437f-82a8-e0f74ac91760" path="/var/lib/kubelet/pods/5540da5e-a02e-437f-82a8-e0f74ac91760/volumes" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.473161 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" event={"ID":"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e","Type":"ContainerStarted","Data":"f850c5bf18d71b920a7d2d8e9851076135152ad64a9dae01f8edfd5dd080117c"} Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.473305 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.476804 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6ccd8d4586-px8lq" event={"ID":"23c298b3-265b-4261-80d8-a79bc312db01","Type":"ContainerStarted","Data":"8540aa107515f95c73c42c71b0862f3fabd4b602088108839de75afa0b3f4c22"} Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.481984 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.482031 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.486721 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"7d4676fc-805d-4ece-9bdb-d58750217b99","Type":"ContainerStarted","Data":"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95"} Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.486840 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-log" containerID="cri-o://88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91" gracePeriod=30 Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.486941 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-httpd" containerID="cri-o://ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95" gracePeriod=30 Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.490649 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"313bd90a-adda-45d6-a1ee-970b9277b7ff","Type":"ContainerStarted","Data":"4be3765007dfb80d2cdaa2cbe29f284d37a161a3a97547367f11eaf78cae7a24"} Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.490951 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-log" containerID="cri-o://7b2b958c2da252793f70c69edc708483f5fb5dbb3e3d7a840d0febb663fa281a" gracePeriod=30 Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.491059 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-httpd" containerID="cri-o://4be3765007dfb80d2cdaa2cbe29f284d37a161a3a97547367f11eaf78cae7a24" gracePeriod=30 Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.502964 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" podStartSLOduration=3.502949675 podStartE2EDuration="3.502949675s" podCreationTimestamp="2025-09-30 14:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:35.501561859 +0000 UTC m=+1339.484650288" watchObservedRunningTime="2025-09-30 14:20:35.502949675 +0000 UTC m=+1339.486038094" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.528021 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.530101 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.530081256 podStartE2EDuration="10.530081256s" podCreationTimestamp="2025-09-30 14:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:35.525328742 +0000 UTC m=+1339.508417171" watchObservedRunningTime="2025-09-30 14:20:35.530081256 +0000 UTC m=+1339.513169685" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.548993 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6ccd8d4586-px8lq" podStartSLOduration=3.548971532 podStartE2EDuration="3.548971532s" podCreationTimestamp="2025-09-30 14:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:35.545301275 +0000 UTC m=+1339.528389704" watchObservedRunningTime="2025-09-30 14:20:35.548971532 +0000 UTC m=+1339.532059961" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.576587 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" 
podStartSLOduration=10.576570824000001 podStartE2EDuration="10.576570824s" podCreationTimestamp="2025-09-30 14:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:35.568658598 +0000 UTC m=+1339.551747037" watchObservedRunningTime="2025-09-30 14:20:35.576570824 +0000 UTC m=+1339.559659253" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.696983 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fbdbfbbdb-mjhjg"] Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.698569 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.705002 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.705360 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.712189 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbdbfbbdb-mjhjg"] Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.737926 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp4tq\" (UniqueName: \"kubernetes.io/projected/1e134bd5-ad40-427d-ba65-7cf9a5a25104-kube-api-access-pp4tq\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.737973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-internal-tls-certs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: 
\"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.738008 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-config-data\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.738027 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-public-tls-certs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.738064 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-config-data-custom\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.738103 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-combined-ca-bundle\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.738162 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e134bd5-ad40-427d-ba65-7cf9a5a25104-logs\") pod 
\"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.839560 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e134bd5-ad40-427d-ba65-7cf9a5a25104-logs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.839930 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp4tq\" (UniqueName: \"kubernetes.io/projected/1e134bd5-ad40-427d-ba65-7cf9a5a25104-kube-api-access-pp4tq\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.840316 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-internal-tls-certs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.840413 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-config-data\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.840484 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-public-tls-certs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: 
\"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.840571 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-config-data-custom\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.840664 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-combined-ca-bundle\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.840170 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e134bd5-ad40-427d-ba65-7cf9a5a25104-logs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.847072 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-public-tls-certs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.850290 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-config-data-custom\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " 
pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.851005 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-combined-ca-bundle\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.859209 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-config-data\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.859697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e134bd5-ad40-427d-ba65-7cf9a5a25104-internal-tls-certs\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:35 crc kubenswrapper[4676]: I0930 14:20:35.865992 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp4tq\" (UniqueName: \"kubernetes.io/projected/1e134bd5-ad40-427d-ba65-7cf9a5a25104-kube-api-access-pp4tq\") pod \"barbican-api-5fbdbfbbdb-mjhjg\" (UID: \"1e134bd5-ad40-427d-ba65-7cf9a5a25104\") " pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.098316 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.436838 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.524185 4676 generic.go:334] "Generic (PLEG): container finished" podID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerID="ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95" exitCode=0 Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.524229 4676 generic.go:334] "Generic (PLEG): container finished" podID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerID="88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91" exitCode=143 Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.524274 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d4676fc-805d-4ece-9bdb-d58750217b99","Type":"ContainerDied","Data":"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95"} Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.524300 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d4676fc-805d-4ece-9bdb-d58750217b99","Type":"ContainerDied","Data":"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91"} Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.524309 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7d4676fc-805d-4ece-9bdb-d58750217b99","Type":"ContainerDied","Data":"667fcbd801572bc8881a8d4ca65fac2b92be93f9ac58ee075ab914a131cd1891"} Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.524327 4676 scope.go:117] "RemoveContainer" containerID="ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.524416 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.528863 4676 generic.go:334] "Generic (PLEG): container finished" podID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerID="4be3765007dfb80d2cdaa2cbe29f284d37a161a3a97547367f11eaf78cae7a24" exitCode=0 Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.528917 4676 generic.go:334] "Generic (PLEG): container finished" podID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerID="7b2b958c2da252793f70c69edc708483f5fb5dbb3e3d7a840d0febb663fa281a" exitCode=143 Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.528942 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"313bd90a-adda-45d6-a1ee-970b9277b7ff","Type":"ContainerDied","Data":"4be3765007dfb80d2cdaa2cbe29f284d37a161a3a97547367f11eaf78cae7a24"} Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.529001 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"313bd90a-adda-45d6-a1ee-970b9277b7ff","Type":"ContainerDied","Data":"7b2b958c2da252793f70c69edc708483f5fb5dbb3e3d7a840d0febb663fa281a"} Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.530844 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerStarted","Data":"c7d8c016700e3bf579da0fc035cee26451c0a8214ac4a4dce55c61c798056b44"} Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.552610 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-logs\") pod \"7d4676fc-805d-4ece-9bdb-d58750217b99\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.552718 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-scripts\") pod \"7d4676fc-805d-4ece-9bdb-d58750217b99\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.552845 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7d4676fc-805d-4ece-9bdb-d58750217b99\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.552903 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-httpd-run\") pod \"7d4676fc-805d-4ece-9bdb-d58750217b99\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.552930 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46bct\" (UniqueName: \"kubernetes.io/projected/7d4676fc-805d-4ece-9bdb-d58750217b99-kube-api-access-46bct\") pod \"7d4676fc-805d-4ece-9bdb-d58750217b99\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.553041 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-combined-ca-bundle\") pod \"7d4676fc-805d-4ece-9bdb-d58750217b99\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.553062 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-config-data\") pod \"7d4676fc-805d-4ece-9bdb-d58750217b99\" (UID: \"7d4676fc-805d-4ece-9bdb-d58750217b99\") " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.553481 4676 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-logs" (OuterVolumeSpecName: "logs") pod "7d4676fc-805d-4ece-9bdb-d58750217b99" (UID: "7d4676fc-805d-4ece-9bdb-d58750217b99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.553650 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7d4676fc-805d-4ece-9bdb-d58750217b99" (UID: "7d4676fc-805d-4ece-9bdb-d58750217b99"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.554168 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.554188 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4676fc-805d-4ece-9bdb-d58750217b99-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.560671 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-scripts" (OuterVolumeSpecName: "scripts") pod "7d4676fc-805d-4ece-9bdb-d58750217b99" (UID: "7d4676fc-805d-4ece-9bdb-d58750217b99"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.561349 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "7d4676fc-805d-4ece-9bdb-d58750217b99" (UID: "7d4676fc-805d-4ece-9bdb-d58750217b99"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.562184 4676 scope.go:117] "RemoveContainer" containerID="88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.562199 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4676fc-805d-4ece-9bdb-d58750217b99-kube-api-access-46bct" (OuterVolumeSpecName: "kube-api-access-46bct") pod "7d4676fc-805d-4ece-9bdb-d58750217b99" (UID: "7d4676fc-805d-4ece-9bdb-d58750217b99"). InnerVolumeSpecName "kube-api-access-46bct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.592030 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d4676fc-805d-4ece-9bdb-d58750217b99" (UID: "7d4676fc-805d-4ece-9bdb-d58750217b99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.604956 4676 scope.go:117] "RemoveContainer" containerID="ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95" Sep 30 14:20:36 crc kubenswrapper[4676]: E0930 14:20:36.605804 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95\": container with ID starting with ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95 not found: ID does not exist" containerID="ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.605848 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95"} err="failed to get container status \"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95\": rpc error: code = NotFound desc = could not find container \"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95\": container with ID starting with ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95 not found: ID does not exist" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.605879 4676 scope.go:117] "RemoveContainer" containerID="88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91" Sep 30 14:20:36 crc kubenswrapper[4676]: E0930 14:20:36.606616 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91\": container with ID starting with 88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91 not found: ID does not exist" containerID="88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.606650 
4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91"} err="failed to get container status \"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91\": rpc error: code = NotFound desc = could not find container \"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91\": container with ID starting with 88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91 not found: ID does not exist" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.606670 4676 scope.go:117] "RemoveContainer" containerID="ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.608008 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95"} err="failed to get container status \"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95\": rpc error: code = NotFound desc = could not find container \"ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95\": container with ID starting with ad0c3d55f30ad0eb951726832136e28eb69464665a1d5db8f80df1ebfbb3cb95 not found: ID does not exist" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.608034 4676 scope.go:117] "RemoveContainer" containerID="88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.608416 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91"} err="failed to get container status \"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91\": rpc error: code = NotFound desc = could not find container \"88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91\": container with ID starting with 
88aa14f1f65f1ffda81bc119123affef53d6c9c8374c58437144fe8c33b10d91 not found: ID does not exist" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.637229 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-config-data" (OuterVolumeSpecName: "config-data") pod "7d4676fc-805d-4ece-9bdb-d58750217b99" (UID: "7d4676fc-805d-4ece-9bdb-d58750217b99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.664230 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.664623 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.664636 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46bct\" (UniqueName: \"kubernetes.io/projected/7d4676fc-805d-4ece-9bdb-d58750217b99-kube-api-access-46bct\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.664646 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.664655 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4676fc-805d-4ece-9bdb-d58750217b99-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.695693 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" 
(UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.766899 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:36 crc kubenswrapper[4676]: I0930 14:20:36.997050 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.012943 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.057420 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: E0930 14:20:37.058109 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-httpd" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.058130 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-httpd" Sep 30 14:20:37 crc kubenswrapper[4676]: E0930 14:20:37.058194 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-log" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.058203 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-log" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.058461 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-log" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.058474 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" containerName="glance-httpd" Sep 30 
14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.059834 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.062644 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.063007 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.078389 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185421 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185470 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-logs\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185538 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmj5w\" (UniqueName: \"kubernetes.io/projected/00c04551-20dc-4c4a-bb5b-39012ad94d51-kube-api-access-zmj5w\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185604 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185662 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185694 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-config-data\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185764 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.185819 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-scripts\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.263218 4676 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fbdbfbbdb-mjhjg"] Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287022 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmj5w\" (UniqueName: \"kubernetes.io/projected/00c04551-20dc-4c4a-bb5b-39012ad94d51-kube-api-access-zmj5w\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287391 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287433 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287478 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-config-data\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287540 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287590 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-scripts\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.287655 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-logs\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.288163 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.288252 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-logs\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 
14:20:37.288738 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.289464 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.293369 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-scripts\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.294474 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.294827 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.299444 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.318681 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmj5w\" (UniqueName: \"kubernetes.io/projected/00c04551-20dc-4c4a-bb5b-39012ad94d51-kube-api-access-zmj5w\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.367166 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.389128 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-config-data\") pod \"313bd90a-adda-45d6-a1ee-970b9277b7ff\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.389849 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-logs\") pod \"313bd90a-adda-45d6-a1ee-970b9277b7ff\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.389931 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-httpd-run\") pod \"313bd90a-adda-45d6-a1ee-970b9277b7ff\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.389999 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-zt2kv\" (UniqueName: \"kubernetes.io/projected/313bd90a-adda-45d6-a1ee-970b9277b7ff-kube-api-access-zt2kv\") pod \"313bd90a-adda-45d6-a1ee-970b9277b7ff\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.390314 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "313bd90a-adda-45d6-a1ee-970b9277b7ff" (UID: "313bd90a-adda-45d6-a1ee-970b9277b7ff"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.390376 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-logs" (OuterVolumeSpecName: "logs") pod "313bd90a-adda-45d6-a1ee-970b9277b7ff" (UID: "313bd90a-adda-45d6-a1ee-970b9277b7ff"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.392630 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-scripts\") pod \"313bd90a-adda-45d6-a1ee-970b9277b7ff\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.392738 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-combined-ca-bundle\") pod \"313bd90a-adda-45d6-a1ee-970b9277b7ff\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.392809 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"313bd90a-adda-45d6-a1ee-970b9277b7ff\" (UID: \"313bd90a-adda-45d6-a1ee-970b9277b7ff\") " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.393576 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.393599 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/313bd90a-adda-45d6-a1ee-970b9277b7ff-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.397553 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-scripts" (OuterVolumeSpecName: "scripts") pod "313bd90a-adda-45d6-a1ee-970b9277b7ff" (UID: "313bd90a-adda-45d6-a1ee-970b9277b7ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.397691 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313bd90a-adda-45d6-a1ee-970b9277b7ff-kube-api-access-zt2kv" (OuterVolumeSpecName: "kube-api-access-zt2kv") pod "313bd90a-adda-45d6-a1ee-970b9277b7ff" (UID: "313bd90a-adda-45d6-a1ee-970b9277b7ff"). InnerVolumeSpecName "kube-api-access-zt2kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.398540 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "313bd90a-adda-45d6-a1ee-970b9277b7ff" (UID: "313bd90a-adda-45d6-a1ee-970b9277b7ff"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.436190 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "313bd90a-adda-45d6-a1ee-970b9277b7ff" (UID: "313bd90a-adda-45d6-a1ee-970b9277b7ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.451515 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4676fc-805d-4ece-9bdb-d58750217b99" path="/var/lib/kubelet/pods/7d4676fc-805d-4ece-9bdb-d58750217b99/volumes" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.467218 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-config-data" (OuterVolumeSpecName: "config-data") pod "313bd90a-adda-45d6-a1ee-970b9277b7ff" (UID: "313bd90a-adda-45d6-a1ee-970b9277b7ff"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.502674 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.502705 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt2kv\" (UniqueName: \"kubernetes.io/projected/313bd90a-adda-45d6-a1ee-970b9277b7ff-kube-api-access-zt2kv\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.502715 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.502724 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313bd90a-adda-45d6-a1ee-970b9277b7ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.502741 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.526240 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.542546 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"313bd90a-adda-45d6-a1ee-970b9277b7ff","Type":"ContainerDied","Data":"f24ca228bb13f0e84cb2a099db9ee909241e2bbfbff2de83c250b632656323a4"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 
14:20:37.542598 4676 scope.go:117] "RemoveContainer" containerID="4be3765007dfb80d2cdaa2cbe29f284d37a161a3a97547367f11eaf78cae7a24" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.542780 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.545678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" event={"ID":"70c7535f-4b3b-438f-9470-c857ece73452","Type":"ContainerStarted","Data":"3420d808b5ad47547c1157ced0615a23135ebbfe0705b0fb32348162ef943869"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.546147 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" event={"ID":"70c7535f-4b3b-438f-9470-c857ece73452","Type":"ContainerStarted","Data":"d77a097b19b8188f98c23bcae315c287118f8dffd206abd14c2c1c93a26f5375"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.547476 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf6466755-m2t9t" event={"ID":"39f521d2-b195-4179-a114-1c1611e4ba2f","Type":"ContainerStarted","Data":"7cb076afd72fa7f68ca6897e819be7a4a0674f42e47ca3ee1d793c17ab5f6334"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.547531 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-bf6466755-m2t9t" event={"ID":"39f521d2-b195-4179-a114-1c1611e4ba2f","Type":"ContainerStarted","Data":"b5f72ce4987be5bb44fed4fc7b6e90800f3bb3bc15d8624356ec7bdcabb2d812"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.549454 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" event={"ID":"1e134bd5-ad40-427d-ba65-7cf9a5a25104","Type":"ContainerStarted","Data":"26b19a892326ebc12b109c677122bfa7dc7c65ea0ffd50c2d04e31753e17e463"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.549488 
4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" event={"ID":"1e134bd5-ad40-427d-ba65-7cf9a5a25104","Type":"ContainerStarted","Data":"c16126979f590253106cb276b9a86c073fd5470e4a56351cf70f4342eddca978"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.553205 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerStarted","Data":"af18c1802fe77e308e9a28b523e216750c8d949edb8500d897eb5240a5070002"} Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.572037 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-777c6c994b-kk5rn" podStartSLOduration=2.403819685 podStartE2EDuration="5.572009418s" podCreationTimestamp="2025-09-30 14:20:32 +0000 UTC" firstStartedPulling="2025-09-30 14:20:33.3781057 +0000 UTC m=+1337.361194129" lastFinishedPulling="2025-09-30 14:20:36.546295433 +0000 UTC m=+1340.529383862" observedRunningTime="2025-09-30 14:20:37.568634219 +0000 UTC m=+1341.551722648" watchObservedRunningTime="2025-09-30 14:20:37.572009418 +0000 UTC m=+1341.555097847" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.585294 4676 scope.go:117] "RemoveContainer" containerID="7b2b958c2da252793f70c69edc708483f5fb5dbb3e3d7a840d0febb663fa281a" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.597452 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-bf6466755-m2t9t" podStartSLOduration=3.244917351 podStartE2EDuration="6.597429884s" podCreationTimestamp="2025-09-30 14:20:31 +0000 UTC" firstStartedPulling="2025-09-30 14:20:33.193763639 +0000 UTC m=+1337.176852068" lastFinishedPulling="2025-09-30 14:20:36.546276172 +0000 UTC m=+1340.529364601" observedRunningTime="2025-09-30 14:20:37.589534177 +0000 UTC m=+1341.572622606" watchObservedRunningTime="2025-09-30 14:20:37.597429884 +0000 UTC m=+1341.580518313" 
Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.604001 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.638810 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.643746 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.655070 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.662656 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: E0930 14:20:37.665571 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-httpd" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.665590 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-httpd" Sep 30 14:20:37 crc kubenswrapper[4676]: E0930 14:20:37.665619 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-log" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.665625 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-log" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.665805 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-log" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.665819 4676 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" containerName="glance-httpd" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.666881 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.671099 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.671123 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.680141 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705445 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705508 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705567 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705598 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-scripts\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705694 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-logs\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705733 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-config-data\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705811 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgdf\" (UniqueName: \"kubernetes.io/projected/77375b04-44bb-4250-a54c-0c193201f738-kube-api-access-5pgdf\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.705856 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808275 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-logs\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808322 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-config-data\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808364 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgdf\" (UniqueName: \"kubernetes.io/projected/77375b04-44bb-4250-a54c-0c193201f738-kube-api-access-5pgdf\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808401 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808462 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808501 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808552 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.808645 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-scripts\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.810081 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.810291 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-logs\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.810358 4676 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.813362 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-scripts\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.818401 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-config-data\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.824531 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.829132 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.847398 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:37 crc kubenswrapper[4676]: I0930 14:20:37.863621 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgdf\" (UniqueName: \"kubernetes.io/projected/77375b04-44bb-4250-a54c-0c193201f738-kube-api-access-5pgdf\") pod \"glance-default-internal-api-0\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " pod="openstack/glance-default-internal-api-0" Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.008370 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.310311 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 14:20:38 crc kubenswrapper[4676]: W0930 14:20:38.319974 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00c04551_20dc_4c4a_bb5b_39012ad94d51.slice/crio-47b6b2ec83b27f33b025f65d8d9b93a207954a291ca9f62a7c6a46f4277138b6 WatchSource:0}: Error finding container 47b6b2ec83b27f33b025f65d8d9b93a207954a291ca9f62a7c6a46f4277138b6: Status 404 returned error can't find the container with id 47b6b2ec83b27f33b025f65d8d9b93a207954a291ca9f62a7c6a46f4277138b6 Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.573128 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerStarted","Data":"e78bcec643723cd2edcb42a790fb63da5fcd0ffd6359c261bb5ac04018cb39be"} Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.577835 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"00c04551-20dc-4c4a-bb5b-39012ad94d51","Type":"ContainerStarted","Data":"47b6b2ec83b27f33b025f65d8d9b93a207954a291ca9f62a7c6a46f4277138b6"} Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.583576 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" event={"ID":"1e134bd5-ad40-427d-ba65-7cf9a5a25104","Type":"ContainerStarted","Data":"8c3b699cd947fdce80b4b0288f4e2dec34c90d4be0df8250d7d0f7a965efdffd"} Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.584579 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.584626 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.614533 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" podStartSLOduration=3.6145105429999997 podStartE2EDuration="3.614510543s" podCreationTimestamp="2025-09-30 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:38.611358721 +0000 UTC m=+1342.594447150" watchObservedRunningTime="2025-09-30 14:20:38.614510543 +0000 UTC m=+1342.597598972" Sep 30 14:20:38 crc kubenswrapper[4676]: I0930 14:20:38.739312 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:20:38 crc kubenswrapper[4676]: W0930 14:20:38.772445 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77375b04_44bb_4250_a54c_0c193201f738.slice/crio-5c0285907c9d2d39e4d88107a5a2f6019fcff6a73931bedf16354a62a21b9d01 WatchSource:0}: Error finding container 5c0285907c9d2d39e4d88107a5a2f6019fcff6a73931bedf16354a62a21b9d01: Status 404 
returned error can't find the container with id 5c0285907c9d2d39e4d88107a5a2f6019fcff6a73931bedf16354a62a21b9d01 Sep 30 14:20:39 crc kubenswrapper[4676]: I0930 14:20:39.456933 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313bd90a-adda-45d6-a1ee-970b9277b7ff" path="/var/lib/kubelet/pods/313bd90a-adda-45d6-a1ee-970b9277b7ff/volumes" Sep 30 14:20:39 crc kubenswrapper[4676]: I0930 14:20:39.595338 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"77375b04-44bb-4250-a54c-0c193201f738","Type":"ContainerStarted","Data":"37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291"} Sep 30 14:20:39 crc kubenswrapper[4676]: I0930 14:20:39.595386 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"77375b04-44bb-4250-a54c-0c193201f738","Type":"ContainerStarted","Data":"5c0285907c9d2d39e4d88107a5a2f6019fcff6a73931bedf16354a62a21b9d01"} Sep 30 14:20:39 crc kubenswrapper[4676]: I0930 14:20:39.597264 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerStarted","Data":"d9f5192673622520d3c069bb015c676dfa1302bca8958f91b0725bb235768f84"} Sep 30 14:20:39 crc kubenswrapper[4676]: I0930 14:20:39.599268 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00c04551-20dc-4c4a-bb5b-39012ad94d51","Type":"ContainerStarted","Data":"3ae371b754f0fb586408bdbfb98125a3b3dd21e489e16eeea16990c6f8789542"} Sep 30 14:20:40 crc kubenswrapper[4676]: I0930 14:20:40.615858 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerStarted","Data":"2e616340cf3700bfda5442a6aef8b774e64a5391c2e1c3e45aaee3a8f716024f"} Sep 30 14:20:40 crc kubenswrapper[4676]: I0930 14:20:40.618520 4676 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:20:40 crc kubenswrapper[4676]: I0930 14:20:40.621440 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00c04551-20dc-4c4a-bb5b-39012ad94d51","Type":"ContainerStarted","Data":"66dab40a6f6f0114922534982e6172da0eede50d0b26c825e28b48c515304ad4"} Sep 30 14:20:40 crc kubenswrapper[4676]: I0930 14:20:40.626568 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"77375b04-44bb-4250-a54c-0c193201f738","Type":"ContainerStarted","Data":"925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9"} Sep 30 14:20:40 crc kubenswrapper[4676]: I0930 14:20:40.653182 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.42413564 podStartE2EDuration="6.653164949s" podCreationTimestamp="2025-09-30 14:20:34 +0000 UTC" firstStartedPulling="2025-09-30 14:20:35.992281391 +0000 UTC m=+1339.975369820" lastFinishedPulling="2025-09-30 14:20:40.22131071 +0000 UTC m=+1344.204399129" observedRunningTime="2025-09-30 14:20:40.645379295 +0000 UTC m=+1344.628467734" watchObservedRunningTime="2025-09-30 14:20:40.653164949 +0000 UTC m=+1344.636253378" Sep 30 14:20:40 crc kubenswrapper[4676]: I0930 14:20:40.672122 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.6720734950000002 podStartE2EDuration="3.672073495s" podCreationTimestamp="2025-09-30 14:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:40.669672452 +0000 UTC m=+1344.652760891" watchObservedRunningTime="2025-09-30 14:20:40.672073495 +0000 UTC m=+1344.655161914" Sep 30 14:20:40 crc kubenswrapper[4676]: I0930 14:20:40.703534 4676 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.703507758 podStartE2EDuration="4.703507758s" podCreationTimestamp="2025-09-30 14:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:40.700201232 +0000 UTC m=+1344.683289671" watchObservedRunningTime="2025-09-30 14:20:40.703507758 +0000 UTC m=+1344.686596197" Sep 30 14:20:41 crc kubenswrapper[4676]: I0930 14:20:41.637543 4676 generic.go:334] "Generic (PLEG): container finished" podID="63080796-b0be-4b3a-8db5-8242e2eb2bb3" containerID="75ae60f130bd580b01b0188c854fd3510ab5c9f6f7640cee5f9d9c91dd5d9345" exitCode=0 Sep 30 14:20:41 crc kubenswrapper[4676]: I0930 14:20:41.637640 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wxj4s" event={"ID":"63080796-b0be-4b3a-8db5-8242e2eb2bb3","Type":"ContainerDied","Data":"75ae60f130bd580b01b0188c854fd3510ab5c9f6f7640cee5f9d9c91dd5d9345"} Sep 30 14:20:42 crc kubenswrapper[4676]: I0930 14:20:42.847138 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:20:42 crc kubenswrapper[4676]: I0930 14:20:42.928466 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-k2b4r"] Sep 30 14:20:42 crc kubenswrapper[4676]: I0930 14:20:42.928727 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerName="dnsmasq-dns" containerID="cri-o://185eae2b3dcc4a3ac93914b66f37a3ecbfc99964c50d6eb283b10d3dbdff7545" gracePeriod=10 Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.190329 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.324614 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63080796-b0be-4b3a-8db5-8242e2eb2bb3-etc-machine-id\") pod \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.324785 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-config-data\") pod \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.324864 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-scripts\") pod \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.324927 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-combined-ca-bundle\") pod \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.324960 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-db-sync-config-data\") pod \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.325044 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7czf\" 
(UniqueName: \"kubernetes.io/projected/63080796-b0be-4b3a-8db5-8242e2eb2bb3-kube-api-access-t7czf\") pod \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\" (UID: \"63080796-b0be-4b3a-8db5-8242e2eb2bb3\") " Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.326355 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63080796-b0be-4b3a-8db5-8242e2eb2bb3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "63080796-b0be-4b3a-8db5-8242e2eb2bb3" (UID: "63080796-b0be-4b3a-8db5-8242e2eb2bb3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.337130 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63080796-b0be-4b3a-8db5-8242e2eb2bb3-kube-api-access-t7czf" (OuterVolumeSpecName: "kube-api-access-t7czf") pod "63080796-b0be-4b3a-8db5-8242e2eb2bb3" (UID: "63080796-b0be-4b3a-8db5-8242e2eb2bb3"). InnerVolumeSpecName "kube-api-access-t7czf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.337259 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-scripts" (OuterVolumeSpecName: "scripts") pod "63080796-b0be-4b3a-8db5-8242e2eb2bb3" (UID: "63080796-b0be-4b3a-8db5-8242e2eb2bb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.348463 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "63080796-b0be-4b3a-8db5-8242e2eb2bb3" (UID: "63080796-b0be-4b3a-8db5-8242e2eb2bb3"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.399118 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63080796-b0be-4b3a-8db5-8242e2eb2bb3" (UID: "63080796-b0be-4b3a-8db5-8242e2eb2bb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.399175 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-config-data" (OuterVolumeSpecName: "config-data") pod "63080796-b0be-4b3a-8db5-8242e2eb2bb3" (UID: "63080796-b0be-4b3a-8db5-8242e2eb2bb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.428951 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.429192 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.429285 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.429362 4676 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/63080796-b0be-4b3a-8db5-8242e2eb2bb3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 
14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.429429 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7czf\" (UniqueName: \"kubernetes.io/projected/63080796-b0be-4b3a-8db5-8242e2eb2bb3-kube-api-access-t7czf\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.429515 4676 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63080796-b0be-4b3a-8db5-8242e2eb2bb3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.550514 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.658833 4676 generic.go:334] "Generic (PLEG): container finished" podID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerID="185eae2b3dcc4a3ac93914b66f37a3ecbfc99964c50d6eb283b10d3dbdff7545" exitCode=0 Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.658929 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" event={"ID":"cef0985d-7a66-49f5-a4ac-fabf92844c3e","Type":"ContainerDied","Data":"185eae2b3dcc4a3ac93914b66f37a3ecbfc99964c50d6eb283b10d3dbdff7545"} Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.661515 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wxj4s" event={"ID":"63080796-b0be-4b3a-8db5-8242e2eb2bb3","Type":"ContainerDied","Data":"c5e57c8741c9ce91017786c2a42a15da636d0308f2596ec415e201cb107129cf"} Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.661645 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e57c8741c9ce91017786c2a42a15da636d0308f2596ec415e201cb107129cf" Sep 30 14:20:43 crc kubenswrapper[4676]: 
I0930 14:20:43.661789 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wxj4s" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.950343 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 14:20:43 crc kubenswrapper[4676]: E0930 14:20:43.955962 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63080796-b0be-4b3a-8db5-8242e2eb2bb3" containerName="cinder-db-sync" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.956005 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="63080796-b0be-4b3a-8db5-8242e2eb2bb3" containerName="cinder-db-sync" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.956368 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="63080796-b0be-4b3a-8db5-8242e2eb2bb3" containerName="cinder-db-sync" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.959338 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.964412 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.964592 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.964803 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jt2br" Sep 30 14:20:43 crc kubenswrapper[4676]: I0930 14:20:43.971646 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.003324 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.041028 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.041497 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.041717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.041856 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.042081 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndxc\" (UniqueName: \"kubernetes.io/projected/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-kube-api-access-pndxc\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.042363 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.130318 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-wszsv"] Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.135616 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.147041 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pndxc\" (UniqueName: \"kubernetes.io/projected/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-kube-api-access-pndxc\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.147143 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.147183 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.147228 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.147271 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-scripts\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.147296 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.147404 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.168594 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.170024 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.171610 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-scripts\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.178805 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.199113 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndxc\" (UniqueName: \"kubernetes.io/projected/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-kube-api-access-pndxc\") pod \"cinder-scheduler-0\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") " pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.229269 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-wszsv"] Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.249160 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.249268 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ntb8\" (UniqueName: \"kubernetes.io/projected/5c59c419-a30f-4ba1-960f-e953943a6d7e-kube-api-access-5ntb8\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.249341 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.249410 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.249434 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.249468 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-config\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.289582 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.317736 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.319380 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.322267 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.334518 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.351011 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ntb8\" (UniqueName: \"kubernetes.io/projected/5c59c419-a30f-4ba1-960f-e953943a6d7e-kube-api-access-5ntb8\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.351122 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.356201 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.356393 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.356433 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.356471 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-config\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.356783 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.357860 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.358113 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.358582 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.359239 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-config\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.379365 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ntb8\" (UniqueName: \"kubernetes.io/projected/5c59c419-a30f-4ba1-960f-e953943a6d7e-kube-api-access-5ntb8\") pod \"dnsmasq-dns-69c986f6d7-wszsv\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.459092 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-scripts\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.459230 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c28f715c-a254-49ae-94e4-60fcacf0e59a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.459303 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data-custom\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.459457 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c28f715c-a254-49ae-94e4-60fcacf0e59a-logs\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.459484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.459563 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwz8m\" (UniqueName: \"kubernetes.io/projected/c28f715c-a254-49ae-94e4-60fcacf0e59a-kube-api-access-vwz8m\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.459637 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.551193 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.560909 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwz8m\" (UniqueName: \"kubernetes.io/projected/c28f715c-a254-49ae-94e4-60fcacf0e59a-kube-api-access-vwz8m\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.561051 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.561140 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-scripts\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.561240 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c28f715c-a254-49ae-94e4-60fcacf0e59a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.561301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data-custom\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.561376 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c28f715c-a254-49ae-94e4-60fcacf0e59a-logs\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.561408 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.561515 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c28f715c-a254-49ae-94e4-60fcacf0e59a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.563910 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c28f715c-a254-49ae-94e4-60fcacf0e59a-logs\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.566679 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-scripts\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.571936 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.572588 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.583740 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data-custom\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.583996 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwz8m\" (UniqueName: \"kubernetes.io/projected/c28f715c-a254-49ae-94e4-60fcacf0e59a-kube-api-access-vwz8m\") pod \"cinder-api-0\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.639736 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 14:20:44 crc kubenswrapper[4676]: I0930 14:20:44.868206 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6668cdff8d-z8vnk" Sep 30 14:20:45 crc kubenswrapper[4676]: I0930 14:20:45.188386 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-68fc47cdb4-6758j" Sep 30 14:20:46 crc kubenswrapper[4676]: I0930 14:20:46.959235 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:20:46 crc kubenswrapper[4676]: I0930 14:20:46.959255 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.068617 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.105234 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" podUID="1e134bd5-ad40-427d-ba65-7cf9a5a25104" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.111173 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" podUID="1e134bd5-ad40-427d-ba65-7cf9a5a25104" containerName="barbican-api" 
probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.514073 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.553453 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.557685 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6668cdff8d-z8vnk" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.645425 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.645543 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.786879 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-68fc47cdb4-6758j" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.862242 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6668cdff8d-z8vnk"] Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.862556 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6668cdff8d-z8vnk" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon-log" 
containerID="cri-o://e982bb57f0c69fac6b00b72b2db739d79c8214c812ea9ff516f0a7d6e34a9939" gracePeriod=30 Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.862751 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6668cdff8d-z8vnk" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" containerID="cri-o://41f7da45c7537f91933be6cb3d5a7cc0b180938f05b9eead2baa6c684dcaccdd" gracePeriod=30 Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.993840 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.994445 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 14:20:47 crc kubenswrapper[4676]: I0930 14:20:47.994958 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.009611 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.009667 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.126130 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.126724 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.435396 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.729280 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.729317 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:48 crc kubenswrapper[4676]: I0930 14:20:48.729330 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.081873 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fbdbfbbdb-mjhjg" Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.162070 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6ccd8d4586-px8lq"] Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.162964 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" containerID="cri-o://1e857e25ea7274253d5df5c66541f35ea25cac84b5e27f1cec64a3de51cd79c9" gracePeriod=30 Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.163183 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" containerID="cri-o://8540aa107515f95c73c42c71b0862f3fabd4b602088108839de75afa0b3f4c22" gracePeriod=30 Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.175177 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": EOF" Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.175280 4676 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": EOF" Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.175980 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": EOF" Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.178554 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": EOF" Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.178725 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": EOF" Sep 30 14:20:49 crc kubenswrapper[4676]: I0930 14:20:49.785069 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:20:50 crc kubenswrapper[4676]: I0930 14:20:50.851035 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" event={"ID":"cef0985d-7a66-49f5-a4ac-fabf92844c3e","Type":"ContainerDied","Data":"204a08448dcc4ef2f75612186a75c6e67dcf048c5c40c62840f5281afd13fb21"} Sep 30 14:20:50 crc kubenswrapper[4676]: I0930 14:20:50.851559 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204a08448dcc4ef2f75612186a75c6e67dcf048c5c40c62840f5281afd13fb21" Sep 30 14:20:50 crc kubenswrapper[4676]: I0930 14:20:50.892658 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="23c298b3-265b-4261-80d8-a79bc312db01" containerID="1e857e25ea7274253d5df5c66541f35ea25cac84b5e27f1cec64a3de51cd79c9" exitCode=143 Sep 30 14:20:50 crc kubenswrapper[4676]: I0930 14:20:50.892786 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6ccd8d4586-px8lq" event={"ID":"23c298b3-265b-4261-80d8-a79bc312db01","Type":"ContainerDied","Data":"1e857e25ea7274253d5df5c66541f35ea25cac84b5e27f1cec64a3de51cd79c9"} Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.000694 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.133973 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-svc\") pod \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.134037 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcgkm\" (UniqueName: \"kubernetes.io/projected/cef0985d-7a66-49f5-a4ac-fabf92844c3e-kube-api-access-zcgkm\") pod \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.134206 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-nb\") pod \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.134278 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-swift-storage-0\") pod 
\"cef0985d-7a66-49f5-a4ac-fabf92844c3e\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.134332 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-sb\") pod \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.134380 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-config\") pod \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\" (UID: \"cef0985d-7a66-49f5-a4ac-fabf92844c3e\") " Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.207555 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef0985d-7a66-49f5-a4ac-fabf92844c3e-kube-api-access-zcgkm" (OuterVolumeSpecName: "kube-api-access-zcgkm") pod "cef0985d-7a66-49f5-a4ac-fabf92844c3e" (UID: "cef0985d-7a66-49f5-a4ac-fabf92844c3e"). InnerVolumeSpecName "kube-api-access-zcgkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.256661 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcgkm\" (UniqueName: \"kubernetes.io/projected/cef0985d-7a66-49f5-a4ac-fabf92844c3e-kube-api-access-zcgkm\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.326912 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cef0985d-7a66-49f5-a4ac-fabf92844c3e" (UID: "cef0985d-7a66-49f5-a4ac-fabf92844c3e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.360239 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.367145 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cef0985d-7a66-49f5-a4ac-fabf92844c3e" (UID: "cef0985d-7a66-49f5-a4ac-fabf92844c3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.465293 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.493779 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-config" (OuterVolumeSpecName: "config") pod "cef0985d-7a66-49f5-a4ac-fabf92844c3e" (UID: "cef0985d-7a66-49f5-a4ac-fabf92844c3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.493932 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6558bbc9d4-wdcbn" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.504595 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cef0985d-7a66-49f5-a4ac-fabf92844c3e" (UID: "cef0985d-7a66-49f5-a4ac-fabf92844c3e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.526032 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cef0985d-7a66-49f5-a4ac-fabf92844c3e" (UID: "cef0985d-7a66-49f5-a4ac-fabf92844c3e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.569148 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.569183 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.569193 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef0985d-7a66-49f5-a4ac-fabf92844c3e-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.717219 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 14:20:51 crc kubenswrapper[4676]: W0930 14:20:51.729536 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6f141d8_0218_4c0e_94b8_7c1a235b2e11.slice/crio-8f04cae86892be24c8a795c55272daf680d7426131ba24445610681789ad61d2 WatchSource:0}: Error finding container 8f04cae86892be24c8a795c55272daf680d7426131ba24445610681789ad61d2: Status 404 returned error can't find the container with id 8f04cae86892be24c8a795c55272daf680d7426131ba24445610681789ad61d2 Sep 30 
14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.809606 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6668cdff8d-z8vnk" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.867355 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-wszsv"] Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.897579 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:51 crc kubenswrapper[4676]: W0930 14:20:51.899174 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c59c419_a30f_4ba1_960f_e953943a6d7e.slice/crio-986402eb6033d7e35cd1ca4a1b6d2158d0bc7868ef940b9ae73d07f036cd7696 WatchSource:0}: Error finding container 986402eb6033d7e35cd1ca4a1b6d2158d0bc7868ef940b9ae73d07f036cd7696: Status 404 returned error can't find the container with id 986402eb6033d7e35cd1ca4a1b6d2158d0bc7868ef940b9ae73d07f036cd7696 Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.911602 4676 generic.go:334] "Generic (PLEG): container finished" podID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerID="41f7da45c7537f91933be6cb3d5a7cc0b180938f05b9eead2baa6c684dcaccdd" exitCode=0 Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.911663 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6668cdff8d-z8vnk" event={"ID":"ace12602-f0f7-4f29-8c37-72c1e840bacc","Type":"ContainerDied","Data":"41f7da45c7537f91933be6cb3d5a7cc0b180938f05b9eead2baa6c684dcaccdd"} Sep 30 14:20:51 crc kubenswrapper[4676]: W0930 14:20:51.919366 4676 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc28f715c_a254_49ae_94e4_60fcacf0e59a.slice/crio-b0100b665e9a27af8c7e00af1121876f65607dcb369a8bbf347d585e2222ff8f WatchSource:0}: Error finding container b0100b665e9a27af8c7e00af1121876f65607dcb369a8bbf347d585e2222ff8f: Status 404 returned error can't find the container with id b0100b665e9a27af8c7e00af1121876f65607dcb369a8bbf347d585e2222ff8f Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.919514 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-d85658b59-96mgj" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.920777 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" Sep 30 14:20:51 crc kubenswrapper[4676]: I0930 14:20:51.921027 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6f141d8-0218-4c0e-94b8-7c1a235b2e11","Type":"ContainerStarted","Data":"8f04cae86892be24c8a795c55272daf680d7426131ba24445610681789ad61d2"} Sep 30 14:20:52 crc kubenswrapper[4676]: I0930 14:20:52.126725 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-k2b4r"] Sep 30 14:20:52 crc kubenswrapper[4676]: I0930 14:20:52.146079 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-k2b4r"] Sep 30 14:20:52 crc kubenswrapper[4676]: I0930 14:20:52.947607 4676 generic.go:334] "Generic (PLEG): container finished" podID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerID="4ed8299712a7685ee0d452f6172577127f517d1280aa1674a5980448bca1689b" exitCode=0 Sep 30 14:20:52 crc kubenswrapper[4676]: I0930 14:20:52.947951 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" event={"ID":"5c59c419-a30f-4ba1-960f-e953943a6d7e","Type":"ContainerDied","Data":"4ed8299712a7685ee0d452f6172577127f517d1280aa1674a5980448bca1689b"} Sep 30 14:20:52 
crc kubenswrapper[4676]: I0930 14:20:52.947984 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" event={"ID":"5c59c419-a30f-4ba1-960f-e953943a6d7e","Type":"ContainerStarted","Data":"986402eb6033d7e35cd1ca4a1b6d2158d0bc7868ef940b9ae73d07f036cd7696"} Sep 30 14:20:52 crc kubenswrapper[4676]: I0930 14:20:52.951777 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c28f715c-a254-49ae-94e4-60fcacf0e59a","Type":"ContainerStarted","Data":"b0100b665e9a27af8c7e00af1121876f65607dcb369a8bbf347d585e2222ff8f"} Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.089211 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.089995 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.136263 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.426531 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.426642 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.458224 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" path="/var/lib/kubelet/pods/cef0985d-7a66-49f5-a4ac-fabf92844c3e/volumes" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.486126 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.557377 4676 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-76fcf4b695-k2b4r" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: i/o timeout" Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.980540 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c28f715c-a254-49ae-94e4-60fcacf0e59a","Type":"ContainerStarted","Data":"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb"} Sep 30 14:20:53 crc kubenswrapper[4676]: I0930 14:20:53.988693 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" event={"ID":"5c59c419-a30f-4ba1-960f-e953943a6d7e","Type":"ContainerStarted","Data":"9d1ef98550228a0013cca58ec73418fd5a0bbfa1786f2637580ae547a8560f62"} Sep 30 14:20:54 crc kubenswrapper[4676]: I0930 14:20:54.018305 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" podStartSLOduration=10.018285336 podStartE2EDuration="10.018285336s" podCreationTimestamp="2025-09-30 14:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:54.009868546 +0000 UTC m=+1357.992956975" watchObservedRunningTime="2025-09-30 14:20:54.018285336 +0000 UTC m=+1358.001373765" Sep 30 14:20:54 crc kubenswrapper[4676]: I0930 14:20:54.222017 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:20:54 crc kubenswrapper[4676]: I0930 14:20:54.551441 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 
14:20:55.007070 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6f141d8-0218-4c0e-94b8-7c1a235b2e11","Type":"ContainerStarted","Data":"79fc113902778c2d239816856ca21d28cd12b90268b02a7e0ed291427e2d4d94"} Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.008368 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6f141d8-0218-4c0e-94b8-7c1a235b2e11","Type":"ContainerStarted","Data":"beea21d7edaa31807923b4791b366b785e78d23ba9a9b0d6e7588d36f69ff128"} Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.011420 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api-log" containerID="cri-o://668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb" gracePeriod=30 Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.011649 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c28f715c-a254-49ae-94e4-60fcacf0e59a","Type":"ContainerStarted","Data":"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b"} Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.011744 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.011748 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api" containerID="cri-o://04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b" gracePeriod=30 Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.034150 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=10.868611602 podStartE2EDuration="12.034122202s" podCreationTimestamp="2025-09-30 14:20:43 +0000 
UTC" firstStartedPulling="2025-09-30 14:20:51.811301978 +0000 UTC m=+1355.794390407" lastFinishedPulling="2025-09-30 14:20:52.976812578 +0000 UTC m=+1356.959901007" observedRunningTime="2025-09-30 14:20:55.028266659 +0000 UTC m=+1359.011355088" watchObservedRunningTime="2025-09-30 14:20:55.034122202 +0000 UTC m=+1359.017210631" Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.060711 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=11.060685588 podStartE2EDuration="11.060685588s" podCreationTimestamp="2025-09-30 14:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:20:55.055871163 +0000 UTC m=+1359.038959592" watchObservedRunningTime="2025-09-30 14:20:55.060685588 +0000 UTC m=+1359.043774007" Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.698386 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:38470->10.217.0.155:9311: read: connection reset by peer" Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.698418 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6ccd8d4586-px8lq" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:38456->10.217.0.155:9311: read: connection reset by peer" Sep 30 14:20:55 crc kubenswrapper[4676]: I0930 14:20:55.985298 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.063814 4676 generic.go:334] "Generic (PLEG): container finished" podID="f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" containerID="a9ee4e61b97d16119699b853291cd49fbeb6cc8203cf25984ecfb1d6e96db9cf" exitCode=0 Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.063927 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2mftx" event={"ID":"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc","Type":"ContainerDied","Data":"a9ee4e61b97d16119699b853291cd49fbeb6cc8203cf25984ecfb1d6e96db9cf"} Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.082773 4676 generic.go:334] "Generic (PLEG): container finished" podID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerID="04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b" exitCode=0 Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.082809 4676 generic.go:334] "Generic (PLEG): container finished" podID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerID="668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb" exitCode=143 Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.082889 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c28f715c-a254-49ae-94e4-60fcacf0e59a","Type":"ContainerDied","Data":"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b"} Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.082922 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c28f715c-a254-49ae-94e4-60fcacf0e59a","Type":"ContainerDied","Data":"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb"} Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.082936 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"c28f715c-a254-49ae-94e4-60fcacf0e59a","Type":"ContainerDied","Data":"b0100b665e9a27af8c7e00af1121876f65607dcb369a8bbf347d585e2222ff8f"} Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.082952 4676 scope.go:117] "RemoveContainer" containerID="04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.083076 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.084518 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c28f715c-a254-49ae-94e4-60fcacf0e59a-logs\") pod \"c28f715c-a254-49ae-94e4-60fcacf0e59a\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.084635 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwz8m\" (UniqueName: \"kubernetes.io/projected/c28f715c-a254-49ae-94e4-60fcacf0e59a-kube-api-access-vwz8m\") pod \"c28f715c-a254-49ae-94e4-60fcacf0e59a\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.084691 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data\") pod \"c28f715c-a254-49ae-94e4-60fcacf0e59a\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.084785 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c28f715c-a254-49ae-94e4-60fcacf0e59a-etc-machine-id\") pod \"c28f715c-a254-49ae-94e4-60fcacf0e59a\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.084813 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-combined-ca-bundle\") pod \"c28f715c-a254-49ae-94e4-60fcacf0e59a\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.084841 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-scripts\") pod \"c28f715c-a254-49ae-94e4-60fcacf0e59a\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.084871 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data-custom\") pod \"c28f715c-a254-49ae-94e4-60fcacf0e59a\" (UID: \"c28f715c-a254-49ae-94e4-60fcacf0e59a\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.085163 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c28f715c-a254-49ae-94e4-60fcacf0e59a-logs" (OuterVolumeSpecName: "logs") pod "c28f715c-a254-49ae-94e4-60fcacf0e59a" (UID: "c28f715c-a254-49ae-94e4-60fcacf0e59a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.085382 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c28f715c-a254-49ae-94e4-60fcacf0e59a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c28f715c-a254-49ae-94e4-60fcacf0e59a" (UID: "c28f715c-a254-49ae-94e4-60fcacf0e59a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.085468 4676 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c28f715c-a254-49ae-94e4-60fcacf0e59a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.085489 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c28f715c-a254-49ae-94e4-60fcacf0e59a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.090030 4676 generic.go:334] "Generic (PLEG): container finished" podID="23c298b3-265b-4261-80d8-a79bc312db01" containerID="8540aa107515f95c73c42c71b0862f3fabd4b602088108839de75afa0b3f4c22" exitCode=0 Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.091978 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6ccd8d4586-px8lq" event={"ID":"23c298b3-265b-4261-80d8-a79bc312db01","Type":"ContainerDied","Data":"8540aa107515f95c73c42c71b0862f3fabd4b602088108839de75afa0b3f4c22"} Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.106709 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28f715c-a254-49ae-94e4-60fcacf0e59a-kube-api-access-vwz8m" (OuterVolumeSpecName: "kube-api-access-vwz8m") pod "c28f715c-a254-49ae-94e4-60fcacf0e59a" (UID: "c28f715c-a254-49ae-94e4-60fcacf0e59a"). InnerVolumeSpecName "kube-api-access-vwz8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.124240 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-scripts" (OuterVolumeSpecName: "scripts") pod "c28f715c-a254-49ae-94e4-60fcacf0e59a" (UID: "c28f715c-a254-49ae-94e4-60fcacf0e59a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.124351 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c28f715c-a254-49ae-94e4-60fcacf0e59a" (UID: "c28f715c-a254-49ae-94e4-60fcacf0e59a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.170968 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.171520 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api-log" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.171535 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api-log" Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.171563 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.171569 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api" Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.171591 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerName="dnsmasq-dns" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.171598 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerName="dnsmasq-dns" Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.171609 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" 
containerName="init" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.171615 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerName="init" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.171770 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.171792 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef0985d-7a66-49f5-a4ac-fabf92844c3e" containerName="dnsmasq-dns" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.171802 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" containerName="cinder-api-log" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.172479 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.180154 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.185007 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.185236 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-h4hfk" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.195627 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwz8m\" (UniqueName: \"kubernetes.io/projected/c28f715c-a254-49ae-94e4-60fcacf0e59a-kube-api-access-vwz8m\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.195657 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-scripts\") on node \"crc\" 
DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.195668 4676 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.203175 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.209175 4676 scope.go:117] "RemoveContainer" containerID="668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.237271 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c28f715c-a254-49ae-94e4-60fcacf0e59a" (UID: "c28f715c-a254-49ae-94e4-60fcacf0e59a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.301207 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data" (OuterVolumeSpecName: "config-data") pod "c28f715c-a254-49ae-94e4-60fcacf0e59a" (UID: "c28f715c-a254-49ae-94e4-60fcacf0e59a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.306047 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f682d501-bba0-4b08-98aa-0ee2a0603939-openstack-config\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.306127 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f682d501-bba0-4b08-98aa-0ee2a0603939-openstack-config-secret\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.306156 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgsdp\" (UniqueName: \"kubernetes.io/projected/f682d501-bba0-4b08-98aa-0ee2a0603939-kube-api-access-qgsdp\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.306255 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f682d501-bba0-4b08-98aa-0ee2a0603939-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.306350 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.306364 4676 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28f715c-a254-49ae-94e4-60fcacf0e59a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.374959 4676 scope.go:117] "RemoveContainer" containerID="04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b" Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.376148 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b\": container with ID starting with 04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b not found: ID does not exist" containerID="04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.376188 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b"} err="failed to get container status \"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b\": rpc error: code = NotFound desc = could not find container \"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b\": container with ID starting with 04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b not found: ID does not exist" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.376208 4676 scope.go:117] "RemoveContainer" containerID="668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb" Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.376404 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb\": container with ID starting with 668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb not found: ID does not exist" 
containerID="668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.376424 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb"} err="failed to get container status \"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb\": rpc error: code = NotFound desc = could not find container \"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb\": container with ID starting with 668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb not found: ID does not exist" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.376437 4676 scope.go:117] "RemoveContainer" containerID="04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.376651 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b"} err="failed to get container status \"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b\": rpc error: code = NotFound desc = could not find container \"04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b\": container with ID starting with 04015019d537669099c8123ab825fb07dfaab05a0543a4c922cef8c735fb6f3b not found: ID does not exist" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.376673 4676 scope.go:117] "RemoveContainer" containerID="668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.376789 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.377446 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb"} err="failed to get container status \"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb\": rpc error: code = NotFound desc = could not find container \"668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb\": container with ID starting with 668cb1de1252b051f90c5be09307fbf24afaa97bc0e03196f9291620088ef4fb not found: ID does not exist" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.428712 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f682d501-bba0-4b08-98aa-0ee2a0603939-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.431940 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f682d501-bba0-4b08-98aa-0ee2a0603939-openstack-config\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.432187 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f682d501-bba0-4b08-98aa-0ee2a0603939-openstack-config-secret\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.432265 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgsdp\" (UniqueName: 
\"kubernetes.io/projected/f682d501-bba0-4b08-98aa-0ee2a0603939-kube-api-access-qgsdp\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.435268 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f682d501-bba0-4b08-98aa-0ee2a0603939-openstack-config\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.441457 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f682d501-bba0-4b08-98aa-0ee2a0603939-openstack-config-secret\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.442002 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f682d501-bba0-4b08-98aa-0ee2a0603939-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.452635 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgsdp\" (UniqueName: \"kubernetes.io/projected/f682d501-bba0-4b08-98aa-0ee2a0603939-kube-api-access-qgsdp\") pod \"openstackclient\" (UID: \"f682d501-bba0-4b08-98aa-0ee2a0603939\") " pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.486559 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.495347 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 
14:20:56.508098 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.508478 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.508498 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" Sep 30 14:20:56 crc kubenswrapper[4676]: E0930 14:20:56.508519 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.508525 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.508695 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.508713 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c298b3-265b-4261-80d8-a79bc312db01" containerName="barbican-api-log" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.509771 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.513349 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.513390 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.517384 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.526127 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.533732 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23c298b3-265b-4261-80d8-a79bc312db01-logs\") pod \"23c298b3-265b-4261-80d8-a79bc312db01\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.533860 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-combined-ca-bundle\") pod \"23c298b3-265b-4261-80d8-a79bc312db01\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.533924 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fvvd\" (UniqueName: \"kubernetes.io/projected/23c298b3-265b-4261-80d8-a79bc312db01-kube-api-access-8fvvd\") pod \"23c298b3-265b-4261-80d8-a79bc312db01\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.533967 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data-custom\") pod \"23c298b3-265b-4261-80d8-a79bc312db01\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.534042 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data\") pod \"23c298b3-265b-4261-80d8-a79bc312db01\" (UID: \"23c298b3-265b-4261-80d8-a79bc312db01\") " Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.536023 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c298b3-265b-4261-80d8-a79bc312db01-logs" (OuterVolumeSpecName: "logs") pod "23c298b3-265b-4261-80d8-a79bc312db01" (UID: "23c298b3-265b-4261-80d8-a79bc312db01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.542397 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c298b3-265b-4261-80d8-a79bc312db01-kube-api-access-8fvvd" (OuterVolumeSpecName: "kube-api-access-8fvvd") pod "23c298b3-265b-4261-80d8-a79bc312db01" (UID: "23c298b3-265b-4261-80d8-a79bc312db01"). InnerVolumeSpecName "kube-api-access-8fvvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.545086 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "23c298b3-265b-4261-80d8-a79bc312db01" (UID: "23c298b3-265b-4261-80d8-a79bc312db01"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.579677 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23c298b3-265b-4261-80d8-a79bc312db01" (UID: "23c298b3-265b-4261-80d8-a79bc312db01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.606918 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data" (OuterVolumeSpecName: "config-data") pod "23c298b3-265b-4261-80d8-a79bc312db01" (UID: "23c298b3-265b-4261-80d8-a79bc312db01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.636665 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.636743 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-config-data\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.636797 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.636831 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f490182f-5ea6-45fa-85d0-a6b1c02c5849-logs\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.636854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f490182f-5ea6-45fa-85d0-a6b1c02c5849-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.637094 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.637273 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-scripts\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.637396 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmfk\" (UniqueName: \"kubernetes.io/projected/f490182f-5ea6-45fa-85d0-a6b1c02c5849-kube-api-access-5wmfk\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.637438 
4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-config-data-custom\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.640257 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23c298b3-265b-4261-80d8-a79bc312db01-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.640299 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.640313 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fvvd\" (UniqueName: \"kubernetes.io/projected/23c298b3-265b-4261-80d8-a79bc312db01-kube-api-access-8fvvd\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.640337 4676 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.640350 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c298b3-265b-4261-80d8-a79bc312db01-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.672934 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746097 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746244 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-scripts\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746312 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wmfk\" (UniqueName: \"kubernetes.io/projected/f490182f-5ea6-45fa-85d0-a6b1c02c5849-kube-api-access-5wmfk\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746335 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-config-data-custom\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746379 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746419 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-config-data\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746488 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f490182f-5ea6-45fa-85d0-a6b1c02c5849-logs\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.746505 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f490182f-5ea6-45fa-85d0-a6b1c02c5849-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.747494 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f490182f-5ea6-45fa-85d0-a6b1c02c5849-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.749164 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f490182f-5ea6-45fa-85d0-a6b1c02c5849-logs\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 
14:20:56.753246 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.754372 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-scripts\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.755230 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-config-data\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.758523 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-config-data-custom\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.758602 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.781010 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f490182f-5ea6-45fa-85d0-a6b1c02c5849-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.792600 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmfk\" (UniqueName: \"kubernetes.io/projected/f490182f-5ea6-45fa-85d0-a6b1c02c5849-kube-api-access-5wmfk\") pod \"cinder-api-0\" (UID: \"f490182f-5ea6-45fa-85d0-a6b1c02c5849\") " pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.828480 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 14:20:56 crc kubenswrapper[4676]: I0930 14:20:56.978587 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 14:20:57 crc kubenswrapper[4676]: W0930 14:20:57.011823 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf682d501_bba0_4b08_98aa_0ee2a0603939.slice/crio-d072662a80e8f8755c14d6c5330dde0059d5cc06a74fb82bcba831220523f236 WatchSource:0}: Error finding container d072662a80e8f8755c14d6c5330dde0059d5cc06a74fb82bcba831220523f236: Status 404 returned error can't find the container with id d072662a80e8f8755c14d6c5330dde0059d5cc06a74fb82bcba831220523f236 Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.122108 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6ccd8d4586-px8lq" event={"ID":"23c298b3-265b-4261-80d8-a79bc312db01","Type":"ContainerDied","Data":"816605b321b4fb0502edda9ac3180961b862a02030113387d09078a37cff4517"} Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.122167 4676 scope.go:117] "RemoveContainer" containerID="8540aa107515f95c73c42c71b0862f3fabd4b602088108839de75afa0b3f4c22" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.122344 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6ccd8d4586-px8lq" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.131458 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f682d501-bba0-4b08-98aa-0ee2a0603939","Type":"ContainerStarted","Data":"d072662a80e8f8755c14d6c5330dde0059d5cc06a74fb82bcba831220523f236"} Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.228124 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6ccd8d4586-px8lq"] Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.238597 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6ccd8d4586-px8lq"] Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.272589 4676 scope.go:117] "RemoveContainer" containerID="1e857e25ea7274253d5df5c66541f35ea25cac84b5e27f1cec64a3de51cd79c9" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.382872 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.448601 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c298b3-265b-4261-80d8-a79bc312db01" path="/var/lib/kubelet/pods/23c298b3-265b-4261-80d8-a79bc312db01/volumes" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.449591 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28f715c-a254-49ae-94e4-60fcacf0e59a" path="/var/lib/kubelet/pods/c28f715c-a254-49ae-94e4-60fcacf0e59a/volumes" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.600014 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2mftx" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.768090 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsmt5\" (UniqueName: \"kubernetes.io/projected/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-kube-api-access-bsmt5\") pod \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.768141 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-combined-ca-bundle\") pod \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.768217 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-config\") pod \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\" (UID: \"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc\") " Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.775918 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-kube-api-access-bsmt5" (OuterVolumeSpecName: "kube-api-access-bsmt5") pod "f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" (UID: "f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc"). InnerVolumeSpecName "kube-api-access-bsmt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.823630 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" (UID: "f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.835081 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-config" (OuterVolumeSpecName: "config") pod "f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" (UID: "f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.875823 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.876183 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsmt5\" (UniqueName: \"kubernetes.io/projected/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-kube-api-access-bsmt5\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:57 crc kubenswrapper[4676]: I0930 14:20:57.876294 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.154155 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f490182f-5ea6-45fa-85d0-a6b1c02c5849","Type":"ContainerStarted","Data":"d96168e59857690e27296a71073845a7499cf196387f09f6da7b4351b2a4b7bc"} Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.176578 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2mftx" event={"ID":"f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc","Type":"ContainerDied","Data":"67d39d4e26cf5e5521d66ba11efa23b3ade4a19783643d7ebefff7340d0ac4ed"} Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.176633 4676 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="67d39d4e26cf5e5521d66ba11efa23b3ade4a19783643d7ebefff7340d0ac4ed" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.176790 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2mftx" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.416580 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bbb84f9b4-l58vk"] Sep 30 14:20:58 crc kubenswrapper[4676]: E0930 14:20:58.417148 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" containerName="neutron-db-sync" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.417188 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" containerName="neutron-db-sync" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.417388 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" containerName="neutron-db-sync" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.418366 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.421951 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jvbzz" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.422054 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.422224 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.433139 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.437178 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-wszsv"] Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.437522 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" podUID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerName="dnsmasq-dns" containerID="cri-o://9d1ef98550228a0013cca58ec73418fd5a0bbfa1786f2637580ae547a8560f62" gracePeriod=10 Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.440614 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.474154 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbb84f9b4-l58vk"] Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.488216 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bwmdv"] Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.490437 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.517267 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bwmdv"] Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606292 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606359 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-config\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606414 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606435 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606459 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-config\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606480 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-ovndb-tls-certs\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606530 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzpt\" (UniqueName: \"kubernetes.io/projected/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-kube-api-access-vfzpt\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606589 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-httpd-config\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606615 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-combined-ca-bundle\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606635 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9vj\" (UniqueName: \"kubernetes.io/projected/5bccc396-3182-4955-8d75-93c5b0b221c6-kube-api-access-nh9vj\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.606674 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-svc\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708119 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-httpd-config\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708629 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-combined-ca-bundle\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708659 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9vj\" (UniqueName: \"kubernetes.io/projected/5bccc396-3182-4955-8d75-93c5b0b221c6-kube-api-access-nh9vj\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708705 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-svc\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708728 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708771 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-config\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708795 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708817 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708841 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-config\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708865 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-ovndb-tls-certs\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.708923 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzpt\" (UniqueName: \"kubernetes.io/projected/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-kube-api-access-vfzpt\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.711864 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-svc\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.734635 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.735181 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-combined-ca-bundle\") pod 
\"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.735313 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-httpd-config\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.735334 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-config\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.735837 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-config\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.736005 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.739418 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc 
kubenswrapper[4676]: I0930 14:20:58.740443 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9vj\" (UniqueName: \"kubernetes.io/projected/5bccc396-3182-4955-8d75-93c5b0b221c6-kube-api-access-nh9vj\") pod \"dnsmasq-dns-5784cf869f-bwmdv\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.743067 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzpt\" (UniqueName: \"kubernetes.io/projected/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-kube-api-access-vfzpt\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.779275 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-ovndb-tls-certs\") pod \"neutron-6bbb84f9b4-l58vk\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:58 crc kubenswrapper[4676]: I0930 14:20:58.817634 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:20:58 crc kubenswrapper[4676]: E0930 14:20:58.906457 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c59c419_a30f_4ba1_960f_e953943a6d7e.slice/crio-9d1ef98550228a0013cca58ec73418fd5a0bbfa1786f2637580ae547a8560f62.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.068395 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.279310 4676 generic.go:334] "Generic (PLEG): container finished" podID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerID="9d1ef98550228a0013cca58ec73418fd5a0bbfa1786f2637580ae547a8560f62" exitCode=0 Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.279376 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" event={"ID":"5c59c419-a30f-4ba1-960f-e953943a6d7e","Type":"ContainerDied","Data":"9d1ef98550228a0013cca58ec73418fd5a0bbfa1786f2637580ae547a8560f62"} Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.281500 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f490182f-5ea6-45fa-85d0-a6b1c02c5849","Type":"ContainerStarted","Data":"baf0d54b5ee2f8c60dcb08f53ed89af9574b5e1fc00eb3368ce79fec8c758110"} Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.290721 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.444943 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.522926 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bwmdv"] Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.539594 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-nb\") pod \"5c59c419-a30f-4ba1-960f-e953943a6d7e\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.539743 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-swift-storage-0\") pod \"5c59c419-a30f-4ba1-960f-e953943a6d7e\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.539805 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-config\") pod \"5c59c419-a30f-4ba1-960f-e953943a6d7e\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.539900 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-svc\") pod \"5c59c419-a30f-4ba1-960f-e953943a6d7e\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.539931 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ntb8\" (UniqueName: \"kubernetes.io/projected/5c59c419-a30f-4ba1-960f-e953943a6d7e-kube-api-access-5ntb8\") pod \"5c59c419-a30f-4ba1-960f-e953943a6d7e\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " 
Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.539993 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-sb\") pod \"5c59c419-a30f-4ba1-960f-e953943a6d7e\" (UID: \"5c59c419-a30f-4ba1-960f-e953943a6d7e\") " Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.566936 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c59c419-a30f-4ba1-960f-e953943a6d7e-kube-api-access-5ntb8" (OuterVolumeSpecName: "kube-api-access-5ntb8") pod "5c59c419-a30f-4ba1-960f-e953943a6d7e" (UID: "5c59c419-a30f-4ba1-960f-e953943a6d7e"). InnerVolumeSpecName "kube-api-access-5ntb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.645495 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ntb8\" (UniqueName: \"kubernetes.io/projected/5c59c419-a30f-4ba1-960f-e953943a6d7e-kube-api-access-5ntb8\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.662731 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-config" (OuterVolumeSpecName: "config") pod "5c59c419-a30f-4ba1-960f-e953943a6d7e" (UID: "5c59c419-a30f-4ba1-960f-e953943a6d7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.670754 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c59c419-a30f-4ba1-960f-e953943a6d7e" (UID: "5c59c419-a30f-4ba1-960f-e953943a6d7e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.677597 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c59c419-a30f-4ba1-960f-e953943a6d7e" (UID: "5c59c419-a30f-4ba1-960f-e953943a6d7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.682206 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5c59c419-a30f-4ba1-960f-e953943a6d7e" (UID: "5c59c419-a30f-4ba1-960f-e953943a6d7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.707913 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c59c419-a30f-4ba1-960f-e953943a6d7e" (UID: "5c59c419-a30f-4ba1-960f-e953943a6d7e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.747865 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.747943 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.747958 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.747968 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.747978 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c59c419-a30f-4ba1-960f-e953943a6d7e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.801473 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 14:20:59 crc kubenswrapper[4676]: I0930 14:20:59.980754 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbb84f9b4-l58vk"] Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.295961 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbb84f9b4-l58vk" 
event={"ID":"e8be2e62-b82f-43cb-b1bb-099d7d0f3478","Type":"ContainerStarted","Data":"5a30999bbbe9b011dc6a502749b1d7d060026c5693c8b509173f2513286bcdc0"} Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.300815 4676 generic.go:334] "Generic (PLEG): container finished" podID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerID="c2cb59ad197d6250a746c1d49fc9115f9f36368e6ecfaa85e386183dd88b5d7e" exitCode=0 Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.300921 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" event={"ID":"5bccc396-3182-4955-8d75-93c5b0b221c6","Type":"ContainerDied","Data":"c2cb59ad197d6250a746c1d49fc9115f9f36368e6ecfaa85e386183dd88b5d7e"} Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.300994 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" event={"ID":"5bccc396-3182-4955-8d75-93c5b0b221c6","Type":"ContainerStarted","Data":"87d6b23ede78bd8fae1eb6c886955a685b894bb2f23c7ba61cea814508eebdea"} Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.320381 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.320945 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-wszsv" event={"ID":"5c59c419-a30f-4ba1-960f-e953943a6d7e","Type":"ContainerDied","Data":"986402eb6033d7e35cd1ca4a1b6d2158d0bc7868ef940b9ae73d07f036cd7696"} Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.321029 4676 scope.go:117] "RemoveContainer" containerID="9d1ef98550228a0013cca58ec73418fd5a0bbfa1786f2637580ae547a8560f62" Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.340623 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f490182f-5ea6-45fa-85d0-a6b1c02c5849","Type":"ContainerStarted","Data":"39c9db1e1b8e1f7fd491106bb2838b097e23505063b0266acaf12a68599224fe"} Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.340982 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.483446 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.496801 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.496783266 podStartE2EDuration="4.496783266s" podCreationTimestamp="2025-09-30 14:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:00.400112842 +0000 UTC m=+1364.383201271" watchObservedRunningTime="2025-09-30 14:21:00.496783266 +0000 UTC m=+1364.479871685" Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.525129 4676 scope.go:117] "RemoveContainer" containerID="4ed8299712a7685ee0d452f6172577127f517d1280aa1674a5980448bca1689b" Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.544691 4676 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-wszsv"] Sep 30 14:21:00 crc kubenswrapper[4676]: I0930 14:21:00.561483 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-wszsv"] Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.352948 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbb84f9b4-l58vk" event={"ID":"e8be2e62-b82f-43cb-b1bb-099d7d0f3478","Type":"ContainerStarted","Data":"7c0af458813ddf93c4dc88ec6ea27a8727dbfa23aae24c5dc0df4b8cb0d0af17"} Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.353278 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbb84f9b4-l58vk" event={"ID":"e8be2e62-b82f-43cb-b1bb-099d7d0f3478","Type":"ContainerStarted","Data":"3961d18819a828f34caead58848f4b73c474baf1b075cbe22e7270b079e01318"} Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.354376 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.356983 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" event={"ID":"5bccc396-3182-4955-8d75-93c5b0b221c6","Type":"ContainerStarted","Data":"f48b8383470f3a455212ee5d9c3258a9e675b448fe0402af5be5813bca42ec5e"} Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.357498 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.360706 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="cinder-scheduler" containerID="cri-o://beea21d7edaa31807923b4791b366b785e78d23ba9a9b0d6e7588d36f69ff128" gracePeriod=30 Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.360957 4676 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="probe" containerID="cri-o://79fc113902778c2d239816856ca21d28cd12b90268b02a7e0ed291427e2d4d94" gracePeriod=30 Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.403620 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" podStartSLOduration=3.403600424 podStartE2EDuration="3.403600424s" podCreationTimestamp="2025-09-30 14:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:01.402898656 +0000 UTC m=+1365.385987095" watchObservedRunningTime="2025-09-30 14:21:01.403600424 +0000 UTC m=+1365.386688853" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.407405 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bbb84f9b4-l58vk" podStartSLOduration=3.407383134 podStartE2EDuration="3.407383134s" podCreationTimestamp="2025-09-30 14:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:01.379376959 +0000 UTC m=+1365.362465398" watchObservedRunningTime="2025-09-30 14:21:01.407383134 +0000 UTC m=+1365.390471563" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.452819 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c59c419-a30f-4ba1-960f-e953943a6d7e" path="/var/lib/kubelet/pods/5c59c419-a30f-4ba1-960f-e953943a6d7e/volumes" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.802244 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6668cdff8d-z8vnk" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: 
connection refused" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.843098 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56956855f5-jwqlp"] Sep 30 14:21:01 crc kubenswrapper[4676]: E0930 14:21:01.843626 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerName="init" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.843649 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerName="init" Sep 30 14:21:01 crc kubenswrapper[4676]: E0930 14:21:01.843677 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerName="dnsmasq-dns" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.843685 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerName="dnsmasq-dns" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.843918 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c59c419-a30f-4ba1-960f-e953943a6d7e" containerName="dnsmasq-dns" Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.847762 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.850059 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.850144 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.852442 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56956855f5-jwqlp"]
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.889310 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-public-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.889385 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-combined-ca-bundle\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.889604 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgwvr\" (UniqueName: \"kubernetes.io/projected/af5d35f3-c607-4084-9585-0a750ea54db5-kube-api-access-dgwvr\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.889655 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-internal-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.889680 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-httpd-config\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.889742 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-config\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.889763 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-ovndb-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.995471 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-config\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.995541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-ovndb-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.995649 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-public-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.995693 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-combined-ca-bundle\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.995716 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgwvr\" (UniqueName: \"kubernetes.io/projected/af5d35f3-c607-4084-9585-0a750ea54db5-kube-api-access-dgwvr\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.995762 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-internal-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:01 crc kubenswrapper[4676]: I0930 14:21:01.995791 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-httpd-config\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.004231 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-internal-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.005292 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-config\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.005371 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-httpd-config\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.006993 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-combined-ca-bundle\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.007094 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-public-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.016349 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5d35f3-c607-4084-9585-0a750ea54db5-ovndb-tls-certs\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.026398 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgwvr\" (UniqueName: \"kubernetes.io/projected/af5d35f3-c607-4084-9585-0a750ea54db5-kube-api-access-dgwvr\") pod \"neutron-56956855f5-jwqlp\" (UID: \"af5d35f3-c607-4084-9585-0a750ea54db5\") " pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.177027 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:02 crc kubenswrapper[4676]: I0930 14:21:02.782806 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56956855f5-jwqlp"]
Sep 30 14:21:02 crc kubenswrapper[4676]: W0930 14:21:02.792049 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf5d35f3_c607_4084_9585_0a750ea54db5.slice/crio-c6854c6b5d0cff1c311623a05368b82ccfee6aa75984e0b4faba997ca604692b WatchSource:0}: Error finding container c6854c6b5d0cff1c311623a05368b82ccfee6aa75984e0b4faba997ca604692b: Status 404 returned error can't find the container with id c6854c6b5d0cff1c311623a05368b82ccfee6aa75984e0b4faba997ca604692b
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.071770 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6f9769955f-6wd7m"]
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.074177 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.077117 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.077311 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.079909 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.092820 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f9769955f-6wd7m"]
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.147243 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wvqj\" (UniqueName: \"kubernetes.io/projected/b18c2fcd-dc66-434b-b3ef-61215f24a511-kube-api-access-7wvqj\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.147578 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b18c2fcd-dc66-434b-b3ef-61215f24a511-run-httpd\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.147794 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b18c2fcd-dc66-434b-b3ef-61215f24a511-log-httpd\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.147944 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-combined-ca-bundle\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.148110 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b18c2fcd-dc66-434b-b3ef-61215f24a511-etc-swift\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.148191 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-public-tls-certs\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.148300 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-internal-tls-certs\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.148412 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-config-data\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.250757 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wvqj\" (UniqueName: \"kubernetes.io/projected/b18c2fcd-dc66-434b-b3ef-61215f24a511-kube-api-access-7wvqj\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.250843 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b18c2fcd-dc66-434b-b3ef-61215f24a511-run-httpd\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.250933 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b18c2fcd-dc66-434b-b3ef-61215f24a511-log-httpd\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.250965 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-combined-ca-bundle\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.251025 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b18c2fcd-dc66-434b-b3ef-61215f24a511-etc-swift\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.251051 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-public-tls-certs\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.251089 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-internal-tls-certs\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.251126 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-config-data\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.251744 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b18c2fcd-dc66-434b-b3ef-61215f24a511-run-httpd\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.252117 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b18c2fcd-dc66-434b-b3ef-61215f24a511-log-httpd\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.256858 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-public-tls-certs\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.257420 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-config-data\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.258452 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b18c2fcd-dc66-434b-b3ef-61215f24a511-etc-swift\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.260613 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-internal-tls-certs\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.261592 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18c2fcd-dc66-434b-b3ef-61215f24a511-combined-ca-bundle\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.278736 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wvqj\" (UniqueName: \"kubernetes.io/projected/b18c2fcd-dc66-434b-b3ef-61215f24a511-kube-api-access-7wvqj\") pod \"swift-proxy-6f9769955f-6wd7m\" (UID: \"b18c2fcd-dc66-434b-b3ef-61215f24a511\") " pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.388708 4676 generic.go:334] "Generic (PLEG): container finished" podID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerID="79fc113902778c2d239816856ca21d28cd12b90268b02a7e0ed291427e2d4d94" exitCode=0
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.388806 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6f141d8-0218-4c0e-94b8-7c1a235b2e11","Type":"ContainerDied","Data":"79fc113902778c2d239816856ca21d28cd12b90268b02a7e0ed291427e2d4d94"}
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.392728 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56956855f5-jwqlp" event={"ID":"af5d35f3-c607-4084-9585-0a750ea54db5","Type":"ContainerStarted","Data":"1b0bce3a44e67b970c0acc75a9a915e5fad5a9a522f78f9946870cb7acf2526b"}
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.393610 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56956855f5-jwqlp" event={"ID":"af5d35f3-c607-4084-9585-0a750ea54db5","Type":"ContainerStarted","Data":"c6854c6b5d0cff1c311623a05368b82ccfee6aa75984e0b4faba997ca604692b"}
Sep 30 14:21:03 crc kubenswrapper[4676]: I0930 14:21:03.398017 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.102294 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f9769955f-6wd7m"]
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.403112 4676 generic.go:334] "Generic (PLEG): container finished" podID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerID="beea21d7edaa31807923b4791b366b785e78d23ba9a9b0d6e7588d36f69ff128" exitCode=0
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.403149 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6f141d8-0218-4c0e-94b8-7c1a235b2e11","Type":"ContainerDied","Data":"beea21d7edaa31807923b4791b366b785e78d23ba9a9b0d6e7588d36f69ff128"}
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.410348 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56956855f5-jwqlp" event={"ID":"af5d35f3-c607-4084-9585-0a750ea54db5","Type":"ContainerStarted","Data":"85379e9883369cc8ba8e06f4fdeb9d38ef2e3af9571d3e7602083d61a0e92f88"}
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.410568 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56956855f5-jwqlp"
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.417448 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f9769955f-6wd7m" event={"ID":"b18c2fcd-dc66-434b-b3ef-61215f24a511","Type":"ContainerStarted","Data":"b7d4f179bbb4fe138dc7492467158b2b071bf1b7b95bc015d49309908d347a42"}
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.428810 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.429251 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-central-agent" containerID="cri-o://af18c1802fe77e308e9a28b523e216750c8d949edb8500d897eb5240a5070002" gracePeriod=30
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.429425 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="proxy-httpd" containerID="cri-o://2e616340cf3700bfda5442a6aef8b774e64a5391c2e1c3e45aaee3a8f716024f" gracePeriod=30
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.429477 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="sg-core" containerID="cri-o://d9f5192673622520d3c069bb015c676dfa1302bca8958f91b0725bb235768f84" gracePeriod=30
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.429514 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-notification-agent" containerID="cri-o://e78bcec643723cd2edcb42a790fb63da5fcd0ffd6359c261bb5ac04018cb39be" gracePeriod=30
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.438009 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56956855f5-jwqlp" podStartSLOduration=3.43797909 podStartE2EDuration="3.43797909s" podCreationTimestamp="2025-09-30 14:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:04.435569576 +0000 UTC m=+1368.418658005" watchObservedRunningTime="2025-09-30 14:21:04.43797909 +0000 UTC m=+1368.421067519"
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.452850 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": EOF"
Sep 30 14:21:04 crc kubenswrapper[4676]: I0930 14:21:04.925844 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": dial tcp 10.217.0.156:3000: connect: connection refused"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.224306 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.321882 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data\") pod \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") "
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.321983 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data-custom\") pod \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") "
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.322038 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-scripts\") pod \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") "
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.322070 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pndxc\" (UniqueName: \"kubernetes.io/projected/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-kube-api-access-pndxc\") pod \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") "
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.322199 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-etc-machine-id\") pod \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") "
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.322220 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-combined-ca-bundle\") pod \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\" (UID: \"e6f141d8-0218-4c0e-94b8-7c1a235b2e11\") "
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.327361 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e6f141d8-0218-4c0e-94b8-7c1a235b2e11" (UID: "e6f141d8-0218-4c0e-94b8-7c1a235b2e11"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.358510 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-scripts" (OuterVolumeSpecName: "scripts") pod "e6f141d8-0218-4c0e-94b8-7c1a235b2e11" (UID: "e6f141d8-0218-4c0e-94b8-7c1a235b2e11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.367483 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-kube-api-access-pndxc" (OuterVolumeSpecName: "kube-api-access-pndxc") pod "e6f141d8-0218-4c0e-94b8-7c1a235b2e11" (UID: "e6f141d8-0218-4c0e-94b8-7c1a235b2e11"). InnerVolumeSpecName "kube-api-access-pndxc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.373836 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e6f141d8-0218-4c0e-94b8-7c1a235b2e11" (UID: "e6f141d8-0218-4c0e-94b8-7c1a235b2e11"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.428295 4676 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data-custom\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.428345 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.428356 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pndxc\" (UniqueName: \"kubernetes.io/projected/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-kube-api-access-pndxc\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.428368 4676 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-etc-machine-id\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.450373 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.526828 4676 generic.go:334] "Generic (PLEG): container finished" podID="3822c455-1e30-49b4-8e75-1e880a010303" containerID="2e616340cf3700bfda5442a6aef8b774e64a5391c2e1c3e45aaee3a8f716024f" exitCode=0
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.526898 4676 generic.go:334] "Generic (PLEG): container finished" podID="3822c455-1e30-49b4-8e75-1e880a010303" containerID="d9f5192673622520d3c069bb015c676dfa1302bca8958f91b0725bb235768f84" exitCode=2
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.526911 4676 generic.go:334] "Generic (PLEG): container finished" podID="3822c455-1e30-49b4-8e75-1e880a010303" containerID="af18c1802fe77e308e9a28b523e216750c8d949edb8500d897eb5240a5070002" exitCode=0
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.528762 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6f9769955f-6wd7m" podStartSLOduration=2.52874375 podStartE2EDuration="2.52874375s" podCreationTimestamp="2025-09-30 14:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:05.51998626 +0000 UTC m=+1369.503074699" watchObservedRunningTime="2025-09-30 14:21:05.52874375 +0000 UTC m=+1369.511832179"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.570097 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data" (OuterVolumeSpecName: "config-data") pod "e6f141d8-0218-4c0e-94b8-7c1a235b2e11" (UID: "e6f141d8-0218-4c0e-94b8-7c1a235b2e11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.593762 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6f141d8-0218-4c0e-94b8-7c1a235b2e11" (UID: "e6f141d8-0218-4c0e-94b8-7c1a235b2e11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595137 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e6f141d8-0218-4c0e-94b8-7c1a235b2e11","Type":"ContainerDied","Data":"8f04cae86892be24c8a795c55272daf680d7426131ba24445610681789ad61d2"}
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595204 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f9769955f-6wd7m" event={"ID":"b18c2fcd-dc66-434b-b3ef-61215f24a511","Type":"ContainerStarted","Data":"a7f1b86362f93a2ef672be3c12a8b3bb5a830c6214fdc11da5ca7dfbde3e1d66"}
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595233 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595240 4676 scope.go:117] "RemoveContainer" containerID="79fc113902778c2d239816856ca21d28cd12b90268b02a7e0ed291427e2d4d94"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595249 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f9769955f-6wd7m" event={"ID":"b18c2fcd-dc66-434b-b3ef-61215f24a511","Type":"ContainerStarted","Data":"96067ea1660fcb5a46a0d80cf6fb0d0e79fbe8fd7f75ad749ec298084380dffe"}
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595298 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f9769955f-6wd7m"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595309 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerDied","Data":"2e616340cf3700bfda5442a6aef8b774e64a5391c2e1c3e45aaee3a8f716024f"}
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595323 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerDied","Data":"d9f5192673622520d3c069bb015c676dfa1302bca8958f91b0725bb235768f84"}
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.595474 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerDied","Data":"af18c1802fe77e308e9a28b523e216750c8d949edb8500d897eb5240a5070002"}
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.633194 4676 scope.go:117] "RemoveContainer" containerID="beea21d7edaa31807923b4791b366b785e78d23ba9a9b0d6e7588d36f69ff128"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.634656 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.634692 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f141d8-0218-4c0e-94b8-7c1a235b2e11-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.791096 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.811727 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.827269 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 14:21:05 crc kubenswrapper[4676]: E0930 14:21:05.827848 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="probe"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.827875 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="probe"
Sep 30 14:21:05 crc kubenswrapper[4676]: E0930 14:21:05.827905 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="cinder-scheduler"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.827937 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="cinder-scheduler"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.828196 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="probe"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.828237 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" containerName="cinder-scheduler"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.829656 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.835176 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.883693 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.940667 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-scripts\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.941025 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21c274fe-4499-4294-b725-96e48b657186-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.941132 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.941288 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-config-data\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0"
Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.941389 4676
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:05 crc kubenswrapper[4676]: I0930 14:21:05.941515 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwvt\" (UniqueName: \"kubernetes.io/projected/21c274fe-4499-4294-b725-96e48b657186-kube-api-access-gzwvt\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.043606 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-config-data\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.043676 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.043729 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwvt\" (UniqueName: \"kubernetes.io/projected/21c274fe-4499-4294-b725-96e48b657186-kube-api-access-gzwvt\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.043778 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-scripts\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.044162 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21c274fe-4499-4294-b725-96e48b657186-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.044205 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21c274fe-4499-4294-b725-96e48b657186-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.044280 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.049547 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-config-data\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.050459 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " 
pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.053047 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.058006 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c274fe-4499-4294-b725-96e48b657186-scripts\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.064582 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwvt\" (UniqueName: \"kubernetes.io/projected/21c274fe-4499-4294-b725-96e48b657186-kube-api-access-gzwvt\") pod \"cinder-scheduler-0\" (UID: \"21c274fe-4499-4294-b725-96e48b657186\") " pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.164726 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 14:21:06 crc kubenswrapper[4676]: I0930 14:21:06.743452 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 14:21:06 crc kubenswrapper[4676]: W0930 14:21:06.792642 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21c274fe_4499_4294_b725_96e48b657186.slice/crio-c59631000fd6b4f6e5c0fa10b8542efa0c1069f44d02e56090f7f595fad8f5f3 WatchSource:0}: Error finding container c59631000fd6b4f6e5c0fa10b8542efa0c1069f44d02e56090f7f595fad8f5f3: Status 404 returned error can't find the container with id c59631000fd6b4f6e5c0fa10b8542efa0c1069f44d02e56090f7f595fad8f5f3 Sep 30 14:21:07 crc kubenswrapper[4676]: I0930 14:21:07.452337 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f141d8-0218-4c0e-94b8-7c1a235b2e11" path="/var/lib/kubelet/pods/e6f141d8-0218-4c0e-94b8-7c1a235b2e11/volumes" Sep 30 14:21:07 crc kubenswrapper[4676]: I0930 14:21:07.567279 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21c274fe-4499-4294-b725-96e48b657186","Type":"ContainerStarted","Data":"c59631000fd6b4f6e5c0fa10b8542efa0c1069f44d02e56090f7f595fad8f5f3"} Sep 30 14:21:08 crc kubenswrapper[4676]: I0930 14:21:08.819058 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:21:08 crc kubenswrapper[4676]: I0930 14:21:08.879121 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-hnlsg"] Sep 30 14:21:08 crc kubenswrapper[4676]: I0930 14:21:08.879397 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerName="dnsmasq-dns" containerID="cri-o://f850c5bf18d71b920a7d2d8e9851076135152ad64a9dae01f8edfd5dd080117c" 
gracePeriod=10 Sep 30 14:21:09 crc kubenswrapper[4676]: I0930 14:21:09.594429 4676 generic.go:334] "Generic (PLEG): container finished" podID="3822c455-1e30-49b4-8e75-1e880a010303" containerID="e78bcec643723cd2edcb42a790fb63da5fcd0ffd6359c261bb5ac04018cb39be" exitCode=0 Sep 30 14:21:09 crc kubenswrapper[4676]: I0930 14:21:09.594575 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerDied","Data":"e78bcec643723cd2edcb42a790fb63da5fcd0ffd6359c261bb5ac04018cb39be"} Sep 30 14:21:09 crc kubenswrapper[4676]: I0930 14:21:09.607190 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21c274fe-4499-4294-b725-96e48b657186","Type":"ContainerStarted","Data":"dac86341ed4b7bfbb131d79dfd07f464a5442e0591b51aad40f118cfebd4a651"} Sep 30 14:21:09 crc kubenswrapper[4676]: I0930 14:21:09.611009 4676 generic.go:334] "Generic (PLEG): container finished" podID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerID="f850c5bf18d71b920a7d2d8e9851076135152ad64a9dae01f8edfd5dd080117c" exitCode=0 Sep 30 14:21:09 crc kubenswrapper[4676]: I0930 14:21:09.611042 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" event={"ID":"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e","Type":"ContainerDied","Data":"f850c5bf18d71b920a7d2d8e9851076135152ad64a9dae01f8edfd5dd080117c"} Sep 30 14:21:10 crc kubenswrapper[4676]: I0930 14:21:10.488034 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 14:21:11 crc kubenswrapper[4676]: I0930 14:21:11.801524 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6668cdff8d-z8vnk" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" 
Sep 30 14:21:11 crc kubenswrapper[4676]: I0930 14:21:11.802435 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6668cdff8d-z8vnk" Sep 30 14:21:12 crc kubenswrapper[4676]: I0930 14:21:12.844860 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Sep 30 14:21:13 crc kubenswrapper[4676]: I0930 14:21:13.404477 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f9769955f-6wd7m" Sep 30 14:21:13 crc kubenswrapper[4676]: I0930 14:21:13.405467 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f9769955f-6wd7m" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.545210 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.552570 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618476 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-svc\") pod \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618587 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhc5d\" (UniqueName: \"kubernetes.io/projected/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-kube-api-access-lhc5d\") pod \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618666 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-log-httpd\") pod \"3822c455-1e30-49b4-8e75-1e880a010303\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618704 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-nb\") pod \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618736 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbkjd\" (UniqueName: \"kubernetes.io/projected/3822c455-1e30-49b4-8e75-1e880a010303-kube-api-access-kbkjd\") pod \"3822c455-1e30-49b4-8e75-1e880a010303\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618790 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-config-data\") pod \"3822c455-1e30-49b4-8e75-1e880a010303\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618826 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-scripts\") pod \"3822c455-1e30-49b4-8e75-1e880a010303\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618849 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-swift-storage-0\") pod \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618907 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-sb\") pod \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618925 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-run-httpd\") pod \"3822c455-1e30-49b4-8e75-1e880a010303\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.618976 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-sg-core-conf-yaml\") pod \"3822c455-1e30-49b4-8e75-1e880a010303\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " Sep 30 14:21:14 crc kubenswrapper[4676]: 
I0930 14:21:14.619048 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-config\") pod \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\" (UID: \"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.619101 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-combined-ca-bundle\") pod \"3822c455-1e30-49b4-8e75-1e880a010303\" (UID: \"3822c455-1e30-49b4-8e75-1e880a010303\") " Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.626100 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3822c455-1e30-49b4-8e75-1e880a010303" (UID: "3822c455-1e30-49b4-8e75-1e880a010303"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.626382 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3822c455-1e30-49b4-8e75-1e880a010303" (UID: "3822c455-1e30-49b4-8e75-1e880a010303"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.644481 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-kube-api-access-lhc5d" (OuterVolumeSpecName: "kube-api-access-lhc5d") pod "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" (UID: "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e"). InnerVolumeSpecName "kube-api-access-lhc5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.644520 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-scripts" (OuterVolumeSpecName: "scripts") pod "3822c455-1e30-49b4-8e75-1e880a010303" (UID: "3822c455-1e30-49b4-8e75-1e880a010303"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.644668 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3822c455-1e30-49b4-8e75-1e880a010303-kube-api-access-kbkjd" (OuterVolumeSpecName: "kube-api-access-kbkjd") pod "3822c455-1e30-49b4-8e75-1e880a010303" (UID: "3822c455-1e30-49b4-8e75-1e880a010303"). InnerVolumeSpecName "kube-api-access-kbkjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.679637 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" event={"ID":"d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e","Type":"ContainerDied","Data":"2f276f3570a080005a25bc08e9f6f6d81cf1403e8ff41344c84940724477d7a4"} Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.679749 4676 scope.go:117] "RemoveContainer" containerID="f850c5bf18d71b920a7d2d8e9851076135152ad64a9dae01f8edfd5dd080117c" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.679924 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-hnlsg" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.688180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3822c455-1e30-49b4-8e75-1e880a010303","Type":"ContainerDied","Data":"c7d8c016700e3bf579da0fc035cee26451c0a8214ac4a4dce55c61c798056b44"} Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.688471 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.710562 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3822c455-1e30-49b4-8e75-1e880a010303" (UID: "3822c455-1e30-49b4-8e75-1e880a010303"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.720795 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhc5d\" (UniqueName: \"kubernetes.io/projected/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-kube-api-access-lhc5d\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.720826 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.720838 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbkjd\" (UniqueName: \"kubernetes.io/projected/3822c455-1e30-49b4-8e75-1e880a010303-kube-api-access-kbkjd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.720849 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.720860 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3822c455-1e30-49b4-8e75-1e880a010303-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:14 crc kubenswrapper[4676]: I0930 14:21:14.720877 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.109256 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-config" (OuterVolumeSpecName: "config") pod "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" (UID: "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.124934 4676 scope.go:117] "RemoveContainer" containerID="e0ff3dafec2fa5e65f44a99bad7928dfa0aed5664e7b4c217b90419015664cae" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.130695 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.135106 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" (UID: "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.141315 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" (UID: "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.142274 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3822c455-1e30-49b4-8e75-1e880a010303" (UID: "3822c455-1e30-49b4-8e75-1e880a010303"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.164452 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" (UID: "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.173429 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" (UID: "d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.221055 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-config-data" (OuterVolumeSpecName: "config-data") pod "3822c455-1e30-49b4-8e75-1e880a010303" (UID: "3822c455-1e30-49b4-8e75-1e880a010303"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.233262 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.233641 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.233656 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.233670 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.233682 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.233690 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3822c455-1e30-49b4-8e75-1e880a010303-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.288570 4676 scope.go:117] "RemoveContainer" containerID="2e616340cf3700bfda5442a6aef8b774e64a5391c2e1c3e45aaee3a8f716024f" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.325217 4676 scope.go:117] "RemoveContainer" containerID="d9f5192673622520d3c069bb015c676dfa1302bca8958f91b0725bb235768f84" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.365061 4676 scope.go:117] "RemoveContainer" containerID="e78bcec643723cd2edcb42a790fb63da5fcd0ffd6359c261bb5ac04018cb39be" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.374967 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.393963 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.406534 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-hnlsg"] Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.417420 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-hnlsg"] Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.473988 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3822c455-1e30-49b4-8e75-1e880a010303" path="/var/lib/kubelet/pods/3822c455-1e30-49b4-8e75-1e880a010303/volumes" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.511642 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" path="/var/lib/kubelet/pods/d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e/volumes" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.512337 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:15 crc kubenswrapper[4676]: E0930 14:21:15.512703 4676 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerName="dnsmasq-dns" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.512719 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerName="dnsmasq-dns" Sep 30 14:21:15 crc kubenswrapper[4676]: E0930 14:21:15.512732 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-notification-agent" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.512738 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-notification-agent" Sep 30 14:21:15 crc kubenswrapper[4676]: E0930 14:21:15.512752 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerName="init" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.512758 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerName="init" Sep 30 14:21:15 crc kubenswrapper[4676]: E0930 14:21:15.512768 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="sg-core" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.512773 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="sg-core" Sep 30 14:21:15 crc kubenswrapper[4676]: E0930 14:21:15.512794 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="proxy-httpd" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.512800 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="proxy-httpd" Sep 30 14:21:15 crc kubenswrapper[4676]: E0930 14:21:15.512810 4676 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-central-agent" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.512816 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-central-agent" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.513008 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8232ccd-2bcc-4cbe-b7b7-92bbe14cb14e" containerName="dnsmasq-dns" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.513026 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-central-agent" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.513034 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="sg-core" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.513043 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="proxy-httpd" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.513058 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3822c455-1e30-49b4-8e75-1e880a010303" containerName="ceilometer-notification-agent" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.515428 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.515550 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.519221 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.519974 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.527414 4676 scope.go:117] "RemoveContainer" containerID="af18c1802fe77e308e9a28b523e216750c8d949edb8500d897eb5240a5070002" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.650349 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-scripts\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.650400 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-config-data\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.651779 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-log-httpd\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.652241 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.652275 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.652297 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msgjl\" (UniqueName: \"kubernetes.io/projected/33f3cbef-5c84-4550-a054-243260e64ccb-kube-api-access-msgjl\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.652318 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-run-httpd\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.702684 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"21c274fe-4499-4294-b725-96e48b657186","Type":"ContainerStarted","Data":"90b0301d125f356ce23dc52bca660bb1ae013a5860ea1d2bf3d82954f83c890b"} Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.735092 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=10.735065329 podStartE2EDuration="10.735065329s" podCreationTimestamp="2025-09-30 14:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:15.726100235 +0000 UTC m=+1379.709188664" 
watchObservedRunningTime="2025-09-30 14:21:15.735065329 +0000 UTC m=+1379.718153758" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.755458 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-scripts\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.755545 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-config-data\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.755770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-log-httpd\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.756071 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.756101 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.756139 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-msgjl\" (UniqueName: \"kubernetes.io/projected/33f3cbef-5c84-4550-a054-243260e64ccb-kube-api-access-msgjl\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.756169 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-run-httpd\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.759599 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-log-httpd\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.763530 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.768069 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-config-data\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.768921 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: 
I0930 14:21:15.769142 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-run-httpd\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.769271 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-scripts\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.783275 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msgjl\" (UniqueName: \"kubernetes.io/projected/33f3cbef-5c84-4550-a054-243260e64ccb-kube-api-access-msgjl\") pod \"ceilometer-0\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " pod="openstack/ceilometer-0" Sep 30 14:21:15 crc kubenswrapper[4676]: I0930 14:21:15.845647 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:16 crc kubenswrapper[4676]: I0930 14:21:16.165167 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 14:21:16 crc kubenswrapper[4676]: I0930 14:21:16.166707 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="21c274fe-4499-4294-b725-96e48b657186" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.169:8080/\": dial tcp 10.217.0.169:8080: connect: connection refused" Sep 30 14:21:18 crc kubenswrapper[4676]: I0930 14:21:18.739703 4676 generic.go:334] "Generic (PLEG): container finished" podID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerID="e982bb57f0c69fac6b00b72b2db739d79c8214c812ea9ff516f0a7d6e34a9939" exitCode=137 Sep 30 14:21:18 crc kubenswrapper[4676]: I0930 14:21:18.739959 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6668cdff8d-z8vnk" event={"ID":"ace12602-f0f7-4f29-8c37-72c1e840bacc","Type":"ContainerDied","Data":"e982bb57f0c69fac6b00b72b2db739d79c8214c812ea9ff516f0a7d6e34a9939"} Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.122799 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6668cdff8d-z8vnk" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.222651 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-config-data\") pod \"ace12602-f0f7-4f29-8c37-72c1e840bacc\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.222699 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-combined-ca-bundle\") pod \"ace12602-f0f7-4f29-8c37-72c1e840bacc\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.222775 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-tls-certs\") pod \"ace12602-f0f7-4f29-8c37-72c1e840bacc\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.223398 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-secret-key\") pod \"ace12602-f0f7-4f29-8c37-72c1e840bacc\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.223486 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace12602-f0f7-4f29-8c37-72c1e840bacc-logs\") pod \"ace12602-f0f7-4f29-8c37-72c1e840bacc\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.223525 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-scripts\") pod \"ace12602-f0f7-4f29-8c37-72c1e840bacc\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.223609 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrhhz\" (UniqueName: \"kubernetes.io/projected/ace12602-f0f7-4f29-8c37-72c1e840bacc-kube-api-access-rrhhz\") pod \"ace12602-f0f7-4f29-8c37-72c1e840bacc\" (UID: \"ace12602-f0f7-4f29-8c37-72c1e840bacc\") " Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.225091 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace12602-f0f7-4f29-8c37-72c1e840bacc-logs" (OuterVolumeSpecName: "logs") pod "ace12602-f0f7-4f29-8c37-72c1e840bacc" (UID: "ace12602-f0f7-4f29-8c37-72c1e840bacc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.227814 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ace12602-f0f7-4f29-8c37-72c1e840bacc" (UID: "ace12602-f0f7-4f29-8c37-72c1e840bacc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.229519 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace12602-f0f7-4f29-8c37-72c1e840bacc-kube-api-access-rrhhz" (OuterVolumeSpecName: "kube-api-access-rrhhz") pod "ace12602-f0f7-4f29-8c37-72c1e840bacc" (UID: "ace12602-f0f7-4f29-8c37-72c1e840bacc"). InnerVolumeSpecName "kube-api-access-rrhhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.259666 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ace12602-f0f7-4f29-8c37-72c1e840bacc" (UID: "ace12602-f0f7-4f29-8c37-72c1e840bacc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.264159 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-config-data" (OuterVolumeSpecName: "config-data") pod "ace12602-f0f7-4f29-8c37-72c1e840bacc" (UID: "ace12602-f0f7-4f29-8c37-72c1e840bacc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.268802 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-scripts" (OuterVolumeSpecName: "scripts") pod "ace12602-f0f7-4f29-8c37-72c1e840bacc" (UID: "ace12602-f0f7-4f29-8c37-72c1e840bacc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.295641 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ace12602-f0f7-4f29-8c37-72c1e840bacc" (UID: "ace12602-f0f7-4f29-8c37-72c1e840bacc"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.325634 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace12602-f0f7-4f29-8c37-72c1e840bacc-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.325676 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.325689 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrhhz\" (UniqueName: \"kubernetes.io/projected/ace12602-f0f7-4f29-8c37-72c1e840bacc-kube-api-access-rrhhz\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.325708 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace12602-f0f7-4f29-8c37-72c1e840bacc-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.325721 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.325733 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.325743 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ace12602-f0f7-4f29-8c37-72c1e840bacc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:19 crc kubenswrapper[4676]: W0930 14:21:19.404049 4676 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33f3cbef_5c84_4550_a054_243260e64ccb.slice/crio-c949ad5a21d85bc8ce9d0652771dc110dad93360d3b07e5db3a33865fa52fbd6 WatchSource:0}: Error finding container c949ad5a21d85bc8ce9d0652771dc110dad93360d3b07e5db3a33865fa52fbd6: Status 404 returned error can't find the container with id c949ad5a21d85bc8ce9d0652771dc110dad93360d3b07e5db3a33865fa52fbd6 Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.407497 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:19 crc kubenswrapper[4676]: E0930 14:21:19.545752 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podace12602_f0f7_4f29_8c37_72c1e840bacc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podace12602_f0f7_4f29_8c37_72c1e840bacc.slice/crio-03b907165789de2b1a209bdfadd9e19d3c353ee52f5e77bfa891efbacc53c12c\": RecentStats: unable to find data in memory cache]" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.763143 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6668cdff8d-z8vnk" event={"ID":"ace12602-f0f7-4f29-8c37-72c1e840bacc","Type":"ContainerDied","Data":"03b907165789de2b1a209bdfadd9e19d3c353ee52f5e77bfa891efbacc53c12c"} Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.764829 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerStarted","Data":"c949ad5a21d85bc8ce9d0652771dc110dad93360d3b07e5db3a33865fa52fbd6"} Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.764928 4676 scope.go:117] "RemoveContainer" containerID="41f7da45c7537f91933be6cb3d5a7cc0b180938f05b9eead2baa6c684dcaccdd" Sep 30 14:21:19 
crc kubenswrapper[4676]: I0930 14:21:19.763154 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6668cdff8d-z8vnk" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.768148 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f682d501-bba0-4b08-98aa-0ee2a0603939","Type":"ContainerStarted","Data":"49c64ab82d063bb64b7df557369f339b85370d11b2edac061bcc15ef7a2699cd"} Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.787356 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.9278334 podStartE2EDuration="23.787339474s" podCreationTimestamp="2025-09-30 14:20:56 +0000 UTC" firstStartedPulling="2025-09-30 14:20:57.027024419 +0000 UTC m=+1361.010112838" lastFinishedPulling="2025-09-30 14:21:18.886530483 +0000 UTC m=+1382.869618912" observedRunningTime="2025-09-30 14:21:19.784776297 +0000 UTC m=+1383.767864726" watchObservedRunningTime="2025-09-30 14:21:19.787339474 +0000 UTC m=+1383.770427913" Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.808988 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6668cdff8d-z8vnk"] Sep 30 14:21:19 crc kubenswrapper[4676]: I0930 14:21:19.814590 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6668cdff8d-z8vnk"] Sep 30 14:21:20 crc kubenswrapper[4676]: I0930 14:21:20.071989 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:20 crc kubenswrapper[4676]: I0930 14:21:20.154654 4676 scope.go:117] "RemoveContainer" containerID="e982bb57f0c69fac6b00b72b2db739d79c8214c812ea9ff516f0a7d6e34a9939" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.255410 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4tch9"] Sep 30 14:21:21 crc kubenswrapper[4676]: E0930 14:21:21.256508 4676 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon-log" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.256554 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon-log" Sep 30 14:21:21 crc kubenswrapper[4676]: E0930 14:21:21.256600 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.256606 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.256866 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.256922 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" containerName="horizon-log" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.258535 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4tch9" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.265912 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4tch9"] Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.350415 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tlcl8"] Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.351591 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tlcl8" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.377365 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgx8j\" (UniqueName: \"kubernetes.io/projected/2c62023c-427e-406f-af99-0d50b6808acb-kube-api-access-tgx8j\") pod \"nova-api-db-create-4tch9\" (UID: \"2c62023c-427e-406f-af99-0d50b6808acb\") " pod="openstack/nova-api-db-create-4tch9" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.416162 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tlcl8"] Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.462747 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace12602-f0f7-4f29-8c37-72c1e840bacc" path="/var/lib/kubelet/pods/ace12602-f0f7-4f29-8c37-72c1e840bacc/volumes" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.463562 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vb8zf"] Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.465073 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vb8zf" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.465073 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vb8zf"] Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.479964 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfm8z\" (UniqueName: \"kubernetes.io/projected/d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a-kube-api-access-xfm8z\") pod \"nova-cell0-db-create-tlcl8\" (UID: \"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a\") " pod="openstack/nova-cell0-db-create-tlcl8" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.480071 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgx8j\" (UniqueName: \"kubernetes.io/projected/2c62023c-427e-406f-af99-0d50b6808acb-kube-api-access-tgx8j\") pod \"nova-api-db-create-4tch9\" (UID: \"2c62023c-427e-406f-af99-0d50b6808acb\") " pod="openstack/nova-api-db-create-4tch9" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.540665 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgx8j\" (UniqueName: \"kubernetes.io/projected/2c62023c-427e-406f-af99-0d50b6808acb-kube-api-access-tgx8j\") pod \"nova-api-db-create-4tch9\" (UID: \"2c62023c-427e-406f-af99-0d50b6808acb\") " pod="openstack/nova-api-db-create-4tch9" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.577136 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.581664 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqvgk\" (UniqueName: \"kubernetes.io/projected/34ae48f1-6128-48b6-a472-a35226869dc2-kube-api-access-vqvgk\") pod \"nova-cell1-db-create-vb8zf\" (UID: \"34ae48f1-6128-48b6-a472-a35226869dc2\") " 
pod="openstack/nova-cell1-db-create-vb8zf" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.581809 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfm8z\" (UniqueName: \"kubernetes.io/projected/d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a-kube-api-access-xfm8z\") pod \"nova-cell0-db-create-tlcl8\" (UID: \"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a\") " pod="openstack/nova-cell0-db-create-tlcl8" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.584629 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4tch9" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.606060 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfm8z\" (UniqueName: \"kubernetes.io/projected/d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a-kube-api-access-xfm8z\") pod \"nova-cell0-db-create-tlcl8\" (UID: \"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a\") " pod="openstack/nova-cell0-db-create-tlcl8" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.715158 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqvgk\" (UniqueName: \"kubernetes.io/projected/34ae48f1-6128-48b6-a472-a35226869dc2-kube-api-access-vqvgk\") pod \"nova-cell1-db-create-vb8zf\" (UID: \"34ae48f1-6128-48b6-a472-a35226869dc2\") " pod="openstack/nova-cell1-db-create-vb8zf" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.744813 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tlcl8" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.764209 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqvgk\" (UniqueName: \"kubernetes.io/projected/34ae48f1-6128-48b6-a472-a35226869dc2-kube-api-access-vqvgk\") pod \"nova-cell1-db-create-vb8zf\" (UID: \"34ae48f1-6128-48b6-a472-a35226869dc2\") " pod="openstack/nova-cell1-db-create-vb8zf" Sep 30 14:21:21 crc kubenswrapper[4676]: I0930 14:21:21.794418 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vb8zf" Sep 30 14:21:22 crc kubenswrapper[4676]: I0930 14:21:22.116374 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4tch9"] Sep 30 14:21:22 crc kubenswrapper[4676]: I0930 14:21:22.811863 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4tch9" event={"ID":"2c62023c-427e-406f-af99-0d50b6808acb","Type":"ContainerStarted","Data":"e94c991cab31483e605cd2ec80ae1d5c9a42d00fdf705eac61d7dbd7015255e5"} Sep 30 14:21:22 crc kubenswrapper[4676]: I0930 14:21:22.887169 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vb8zf"] Sep 30 14:21:22 crc kubenswrapper[4676]: I0930 14:21:22.956311 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tlcl8"] Sep 30 14:21:22 crc kubenswrapper[4676]: W0930 14:21:22.964253 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c5910c_a5af_4d9f_9c16_81ac73cd7a5a.slice/crio-8fd673023135ab2415172a19e0549cc4b091bead7de883d8e75542106816067a WatchSource:0}: Error finding container 8fd673023135ab2415172a19e0549cc4b091bead7de883d8e75542106816067a: Status 404 returned error can't find the container with id 8fd673023135ab2415172a19e0549cc4b091bead7de883d8e75542106816067a Sep 30 14:21:23 crc 
kubenswrapper[4676]: I0930 14:21:23.827084 4676 generic.go:334] "Generic (PLEG): container finished" podID="d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a" containerID="aaead5d4de93e4ccde86968661a2507e4c9ca3492a9b560f080713716c5f05ad" exitCode=0 Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.827191 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tlcl8" event={"ID":"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a","Type":"ContainerDied","Data":"aaead5d4de93e4ccde86968661a2507e4c9ca3492a9b560f080713716c5f05ad"} Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.827551 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tlcl8" event={"ID":"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a","Type":"ContainerStarted","Data":"8fd673023135ab2415172a19e0549cc4b091bead7de883d8e75542106816067a"} Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.831251 4676 generic.go:334] "Generic (PLEG): container finished" podID="34ae48f1-6128-48b6-a472-a35226869dc2" containerID="7b1734f93cb29c2b48f80dbd407a8fb9e804f31fa645c0bfce7cc6d82e975343" exitCode=0 Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.831435 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vb8zf" event={"ID":"34ae48f1-6128-48b6-a472-a35226869dc2","Type":"ContainerDied","Data":"7b1734f93cb29c2b48f80dbd407a8fb9e804f31fa645c0bfce7cc6d82e975343"} Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.831728 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vb8zf" event={"ID":"34ae48f1-6128-48b6-a472-a35226869dc2","Type":"ContainerStarted","Data":"9ff791ddd67c41424480bfecc7bf89fbd8250aca50904367e08c38879a171f41"} Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.836327 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerStarted","Data":"0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f"} Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.850467 4676 generic.go:334] "Generic (PLEG): container finished" podID="2c62023c-427e-406f-af99-0d50b6808acb" containerID="15fe0d9acd3c959cd25fc1a238869dc72f39e3c8750058b321b878b777976149" exitCode=0 Sep 30 14:21:23 crc kubenswrapper[4676]: I0930 14:21:23.850529 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4tch9" event={"ID":"2c62023c-427e-406f-af99-0d50b6808acb","Type":"ContainerDied","Data":"15fe0d9acd3c959cd25fc1a238869dc72f39e3c8750058b321b878b777976149"} Sep 30 14:21:24 crc kubenswrapper[4676]: I0930 14:21:24.889616 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerStarted","Data":"8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6"} Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.397393 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tlcl8" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.500027 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfm8z\" (UniqueName: \"kubernetes.io/projected/d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a-kube-api-access-xfm8z\") pod \"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a\" (UID: \"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a\") " Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.516166 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a-kube-api-access-xfm8z" (OuterVolumeSpecName: "kube-api-access-xfm8z") pod "d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a" (UID: "d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a"). InnerVolumeSpecName "kube-api-access-xfm8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.555258 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4tch9" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.559508 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vb8zf" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.604855 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgx8j\" (UniqueName: \"kubernetes.io/projected/2c62023c-427e-406f-af99-0d50b6808acb-kube-api-access-tgx8j\") pod \"2c62023c-427e-406f-af99-0d50b6808acb\" (UID: \"2c62023c-427e-406f-af99-0d50b6808acb\") " Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.605103 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqvgk\" (UniqueName: \"kubernetes.io/projected/34ae48f1-6128-48b6-a472-a35226869dc2-kube-api-access-vqvgk\") pod \"34ae48f1-6128-48b6-a472-a35226869dc2\" (UID: \"34ae48f1-6128-48b6-a472-a35226869dc2\") " Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.605657 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfm8z\" (UniqueName: \"kubernetes.io/projected/d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a-kube-api-access-xfm8z\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.612625 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ae48f1-6128-48b6-a472-a35226869dc2-kube-api-access-vqvgk" (OuterVolumeSpecName: "kube-api-access-vqvgk") pod "34ae48f1-6128-48b6-a472-a35226869dc2" (UID: "34ae48f1-6128-48b6-a472-a35226869dc2"). InnerVolumeSpecName "kube-api-access-vqvgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.613481 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c62023c-427e-406f-af99-0d50b6808acb-kube-api-access-tgx8j" (OuterVolumeSpecName: "kube-api-access-tgx8j") pod "2c62023c-427e-406f-af99-0d50b6808acb" (UID: "2c62023c-427e-406f-af99-0d50b6808acb"). InnerVolumeSpecName "kube-api-access-tgx8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.707160 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqvgk\" (UniqueName: \"kubernetes.io/projected/34ae48f1-6128-48b6-a472-a35226869dc2-kube-api-access-vqvgk\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.707196 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgx8j\" (UniqueName: \"kubernetes.io/projected/2c62023c-427e-406f-af99-0d50b6808acb-kube-api-access-tgx8j\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.916044 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tlcl8" event={"ID":"d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a","Type":"ContainerDied","Data":"8fd673023135ab2415172a19e0549cc4b091bead7de883d8e75542106816067a"} Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.916460 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fd673023135ab2415172a19e0549cc4b091bead7de883d8e75542106816067a" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.916087 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tlcl8" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.920130 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vb8zf" event={"ID":"34ae48f1-6128-48b6-a472-a35226869dc2","Type":"ContainerDied","Data":"9ff791ddd67c41424480bfecc7bf89fbd8250aca50904367e08c38879a171f41"} Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.920181 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ff791ddd67c41424480bfecc7bf89fbd8250aca50904367e08c38879a171f41" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.920177 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vb8zf" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.923057 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerStarted","Data":"c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37"} Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.929695 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4tch9" event={"ID":"2c62023c-427e-406f-af99-0d50b6808acb","Type":"ContainerDied","Data":"e94c991cab31483e605cd2ec80ae1d5c9a42d00fdf705eac61d7dbd7015255e5"} Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.929774 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e94c991cab31483e605cd2ec80ae1d5c9a42d00fdf705eac61d7dbd7015255e5" Sep 30 14:21:25 crc kubenswrapper[4676]: I0930 14:21:25.929841 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4tch9" Sep 30 14:21:26 crc kubenswrapper[4676]: I0930 14:21:26.942405 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerStarted","Data":"0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16"} Sep 30 14:21:26 crc kubenswrapper[4676]: I0930 14:21:26.942692 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="sg-core" containerID="cri-o://c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37" gracePeriod=30 Sep 30 14:21:26 crc kubenswrapper[4676]: I0930 14:21:26.942827 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:21:26 crc kubenswrapper[4676]: I0930 14:21:26.942751 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="proxy-httpd" containerID="cri-o://0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16" gracePeriod=30 Sep 30 14:21:26 crc kubenswrapper[4676]: I0930 14:21:26.942787 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-central-agent" containerID="cri-o://0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f" gracePeriod=30 Sep 30 14:21:26 crc kubenswrapper[4676]: I0930 14:21:26.942705 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-notification-agent" containerID="cri-o://8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6" gracePeriod=30 Sep 30 14:21:26 crc kubenswrapper[4676]: I0930 14:21:26.971604 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.11797485 podStartE2EDuration="11.971585602s" podCreationTimestamp="2025-09-30 14:21:15 +0000 UTC" firstStartedPulling="2025-09-30 14:21:19.406342848 +0000 UTC m=+1383.389431277" lastFinishedPulling="2025-09-30 14:21:26.2599536 +0000 UTC m=+1390.243042029" observedRunningTime="2025-09-30 14:21:26.968620835 +0000 UTC m=+1390.951709264" watchObservedRunningTime="2025-09-30 14:21:26.971585602 +0000 UTC m=+1390.954674031" Sep 30 14:21:27 crc kubenswrapper[4676]: I0930 14:21:27.956940 4676 generic.go:334] "Generic (PLEG): container finished" podID="33f3cbef-5c84-4550-a054-243260e64ccb" containerID="0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16" exitCode=0 Sep 30 14:21:27 crc kubenswrapper[4676]: I0930 14:21:27.956974 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerDied","Data":"0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16"} Sep 30 14:21:27 crc kubenswrapper[4676]: I0930 14:21:27.957019 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerDied","Data":"c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37"} Sep 30 14:21:27 crc kubenswrapper[4676]: I0930 14:21:27.956990 4676 generic.go:334] "Generic (PLEG): container finished" podID="33f3cbef-5c84-4550-a054-243260e64ccb" containerID="c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37" exitCode=2 Sep 30 14:21:27 crc kubenswrapper[4676]: I0930 14:21:27.957050 4676 generic.go:334] "Generic (PLEG): container finished" podID="33f3cbef-5c84-4550-a054-243260e64ccb" containerID="8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6" exitCode=0 Sep 30 14:21:27 crc kubenswrapper[4676]: I0930 14:21:27.957069 4676 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerDied","Data":"8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6"} Sep 30 14:21:29 crc kubenswrapper[4676]: I0930 14:21:29.080837 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.545927 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ccb8-account-create-f8xw8"] Sep 30 14:21:31 crc kubenswrapper[4676]: E0930 14:21:31.546838 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c62023c-427e-406f-af99-0d50b6808acb" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.546851 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c62023c-427e-406f-af99-0d50b6808acb" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: E0930 14:21:31.546890 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.546897 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: E0930 14:21:31.546916 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ae48f1-6128-48b6-a472-a35226869dc2" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.546924 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ae48f1-6128-48b6-a472-a35226869dc2" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.547094 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c62023c-427e-406f-af99-0d50b6808acb" containerName="mariadb-database-create" Sep 
30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.547115 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ae48f1-6128-48b6-a472-a35226869dc2" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.547131 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a" containerName="mariadb-database-create" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.547687 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ccb8-account-create-f8xw8" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.554318 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.560276 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ccb8-account-create-f8xw8"] Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.618302 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzr2\" (UniqueName: \"kubernetes.io/projected/69502754-998e-4124-8956-67747a925b66-kube-api-access-qwzr2\") pod \"nova-api-ccb8-account-create-f8xw8\" (UID: \"69502754-998e-4124-8956-67747a925b66\") " pod="openstack/nova-api-ccb8-account-create-f8xw8" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.718927 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5f61-account-create-jbd7r"] Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.719958 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzr2\" (UniqueName: \"kubernetes.io/projected/69502754-998e-4124-8956-67747a925b66-kube-api-access-qwzr2\") pod \"nova-api-ccb8-account-create-f8xw8\" (UID: \"69502754-998e-4124-8956-67747a925b66\") " pod="openstack/nova-api-ccb8-account-create-f8xw8" Sep 30 14:21:31 crc kubenswrapper[4676]: 
I0930 14:21:31.722360 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5f61-account-create-jbd7r" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.724525 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.732454 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5f61-account-create-jbd7r"] Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.752076 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzr2\" (UniqueName: \"kubernetes.io/projected/69502754-998e-4124-8956-67747a925b66-kube-api-access-qwzr2\") pod \"nova-api-ccb8-account-create-f8xw8\" (UID: \"69502754-998e-4124-8956-67747a925b66\") " pod="openstack/nova-api-ccb8-account-create-f8xw8" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.808048 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.869743 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ccb8-account-create-f8xw8" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.923149 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1872-account-create-m4vlg"] Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.924002 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-scripts\") pod \"33f3cbef-5c84-4550-a054-243260e64ccb\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.924091 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-sg-core-conf-yaml\") pod \"33f3cbef-5c84-4550-a054-243260e64ccb\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.924210 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msgjl\" (UniqueName: \"kubernetes.io/projected/33f3cbef-5c84-4550-a054-243260e64ccb-kube-api-access-msgjl\") pod \"33f3cbef-5c84-4550-a054-243260e64ccb\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.924241 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-config-data\") pod \"33f3cbef-5c84-4550-a054-243260e64ccb\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.924331 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-log-httpd\") pod \"33f3cbef-5c84-4550-a054-243260e64ccb\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " Sep 
30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.924351 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-combined-ca-bundle\") pod \"33f3cbef-5c84-4550-a054-243260e64ccb\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.924542 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-run-httpd\") pod \"33f3cbef-5c84-4550-a054-243260e64ccb\" (UID: \"33f3cbef-5c84-4550-a054-243260e64ccb\") " Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.925371 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ftd\" (UniqueName: \"kubernetes.io/projected/96101623-7868-4b78-8c66-9aeb8e5e8ec5-kube-api-access-v5ftd\") pod \"nova-cell0-5f61-account-create-jbd7r\" (UID: \"96101623-7868-4b78-8c66-9aeb8e5e8ec5\") " pod="openstack/nova-cell0-5f61-account-create-jbd7r" Sep 30 14:21:31 crc kubenswrapper[4676]: E0930 14:21:31.927052 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-notification-agent" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927079 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-notification-agent" Sep 30 14:21:31 crc kubenswrapper[4676]: E0930 14:21:31.927110 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="proxy-httpd" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927121 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="proxy-httpd" Sep 30 14:21:31 crc kubenswrapper[4676]: E0930 
14:21:31.927141 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="sg-core" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927159 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="sg-core" Sep 30 14:21:31 crc kubenswrapper[4676]: E0930 14:21:31.927185 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-central-agent" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927191 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-central-agent" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927381 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-notification-agent" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927398 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="proxy-httpd" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927416 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="sg-core" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927430 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" containerName="ceilometer-central-agent" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.927479 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33f3cbef-5c84-4550-a054-243260e64ccb" (UID: "33f3cbef-5c84-4550-a054-243260e64ccb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.928079 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1872-account-create-m4vlg" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.928312 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33f3cbef-5c84-4550-a054-243260e64ccb" (UID: "33f3cbef-5c84-4550-a054-243260e64ccb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.929809 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-scripts" (OuterVolumeSpecName: "scripts") pod "33f3cbef-5c84-4550-a054-243260e64ccb" (UID: "33f3cbef-5c84-4550-a054-243260e64ccb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.930648 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f3cbef-5c84-4550-a054-243260e64ccb-kube-api-access-msgjl" (OuterVolumeSpecName: "kube-api-access-msgjl") pod "33f3cbef-5c84-4550-a054-243260e64ccb" (UID: "33f3cbef-5c84-4550-a054-243260e64ccb"). InnerVolumeSpecName "kube-api-access-msgjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.931424 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.939178 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1872-account-create-m4vlg"] Sep 30 14:21:31 crc kubenswrapper[4676]: I0930 14:21:31.973466 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33f3cbef-5c84-4550-a054-243260e64ccb" (UID: "33f3cbef-5c84-4550-a054-243260e64ccb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.020824 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33f3cbef-5c84-4550-a054-243260e64ccb" (UID: "33f3cbef-5c84-4550-a054-243260e64ccb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.027071 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5pg6\" (UniqueName: \"kubernetes.io/projected/d8eb7b5d-dac6-4093-a364-cc7311e159ef-kube-api-access-n5pg6\") pod \"nova-cell1-1872-account-create-m4vlg\" (UID: \"d8eb7b5d-dac6-4093-a364-cc7311e159ef\") " pod="openstack/nova-cell1-1872-account-create-m4vlg" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.027131 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ftd\" (UniqueName: \"kubernetes.io/projected/96101623-7868-4b78-8c66-9aeb8e5e8ec5-kube-api-access-v5ftd\") pod \"nova-cell0-5f61-account-create-jbd7r\" (UID: \"96101623-7868-4b78-8c66-9aeb8e5e8ec5\") " pod="openstack/nova-cell0-5f61-account-create-jbd7r" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.027245 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.027261 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.027273 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.027285 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msgjl\" (UniqueName: \"kubernetes.io/projected/33f3cbef-5c84-4550-a054-243260e64ccb-kube-api-access-msgjl\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:32 crc 
kubenswrapper[4676]: I0930 14:21:32.027298 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33f3cbef-5c84-4550-a054-243260e64ccb-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.027309 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.028478 4676 generic.go:334] "Generic (PLEG): container finished" podID="33f3cbef-5c84-4550-a054-243260e64ccb" containerID="0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f" exitCode=0 Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.028512 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerDied","Data":"0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f"} Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.028537 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33f3cbef-5c84-4550-a054-243260e64ccb","Type":"ContainerDied","Data":"c949ad5a21d85bc8ce9d0652771dc110dad93360d3b07e5db3a33865fa52fbd6"} Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.028555 4676 scope.go:117] "RemoveContainer" containerID="0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.028696 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.048015 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ftd\" (UniqueName: \"kubernetes.io/projected/96101623-7868-4b78-8c66-9aeb8e5e8ec5-kube-api-access-v5ftd\") pod \"nova-cell0-5f61-account-create-jbd7r\" (UID: \"96101623-7868-4b78-8c66-9aeb8e5e8ec5\") " pod="openstack/nova-cell0-5f61-account-create-jbd7r" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.063294 4676 scope.go:117] "RemoveContainer" containerID="c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.066215 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-config-data" (OuterVolumeSpecName: "config-data") pod "33f3cbef-5c84-4550-a054-243260e64ccb" (UID: "33f3cbef-5c84-4550-a054-243260e64ccb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.086517 4676 scope.go:117] "RemoveContainer" containerID="8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.118368 4676 scope.go:117] "RemoveContainer" containerID="0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.119204 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5f61-account-create-jbd7r" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.129776 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5pg6\" (UniqueName: \"kubernetes.io/projected/d8eb7b5d-dac6-4093-a364-cc7311e159ef-kube-api-access-n5pg6\") pod \"nova-cell1-1872-account-create-m4vlg\" (UID: \"d8eb7b5d-dac6-4093-a364-cc7311e159ef\") " pod="openstack/nova-cell1-1872-account-create-m4vlg" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.129987 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f3cbef-5c84-4550-a054-243260e64ccb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.147021 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5pg6\" (UniqueName: \"kubernetes.io/projected/d8eb7b5d-dac6-4093-a364-cc7311e159ef-kube-api-access-n5pg6\") pod \"nova-cell1-1872-account-create-m4vlg\" (UID: \"d8eb7b5d-dac6-4093-a364-cc7311e159ef\") " pod="openstack/nova-cell1-1872-account-create-m4vlg" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.150451 4676 scope.go:117] "RemoveContainer" containerID="0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16" Sep 30 14:21:32 crc kubenswrapper[4676]: E0930 14:21:32.151026 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16\": container with ID starting with 0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16 not found: ID does not exist" containerID="0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.151070 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16"} err="failed to get container status \"0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16\": rpc error: code = NotFound desc = could not find container \"0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16\": container with ID starting with 0c2e2e41eab996d515740d082e453e2e88853a75d15f8a56deff9a9464d29e16 not found: ID does not exist" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.151205 4676 scope.go:117] "RemoveContainer" containerID="c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37" Sep 30 14:21:32 crc kubenswrapper[4676]: E0930 14:21:32.151949 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37\": container with ID starting with c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37 not found: ID does not exist" containerID="c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.151998 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37"} err="failed to get container status \"c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37\": rpc error: code = NotFound desc = could not find container \"c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37\": container with ID starting with c22871fd38952929b0a4f96b3faac485055d8003bd2820aaeaed2d5bc305be37 not found: ID does not exist" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.152025 4676 scope.go:117] "RemoveContainer" containerID="8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6" Sep 30 14:21:32 crc kubenswrapper[4676]: E0930 14:21:32.152411 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6\": container with ID starting with 8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6 not found: ID does not exist" containerID="8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.152438 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6"} err="failed to get container status \"8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6\": rpc error: code = NotFound desc = could not find container \"8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6\": container with ID starting with 8f36bf6df7ec5c16f48fad3f8feeec6b822748b9505d27954935b3b500f00ff6 not found: ID does not exist" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.152452 4676 scope.go:117] "RemoveContainer" containerID="0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f" Sep 30 14:21:32 crc kubenswrapper[4676]: E0930 14:21:32.152673 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f\": container with ID starting with 0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f not found: ID does not exist" containerID="0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.152704 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f"} err="failed to get container status \"0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f\": rpc error: code = NotFound desc = could not find container 
\"0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f\": container with ID starting with 0ec986627bf43ee5c63f707f576fff8e0274ee6cf39d3bd39d1ac02101a5020f not found: ID does not exist" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.215240 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56956855f5-jwqlp" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.283676 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbb84f9b4-l58vk"] Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.283908 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbb84f9b4-l58vk" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-api" containerID="cri-o://3961d18819a828f34caead58848f4b73c474baf1b075cbe22e7270b079e01318" gracePeriod=30 Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.284338 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbb84f9b4-l58vk" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-httpd" containerID="cri-o://7c0af458813ddf93c4dc88ec6ea27a8727dbfa23aae24c5dc0df4b8cb0d0af17" gracePeriod=30 Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.326097 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1872-account-create-m4vlg" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.359383 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ccb8-account-create-f8xw8"] Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.389056 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.419463 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.458828 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.461545 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.467777 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.468139 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.474439 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.556614 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.557073 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-scripts\") pod \"ceilometer-0\" (UID: 
\"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.557105 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwwlx\" (UniqueName: \"kubernetes.io/projected/b4192acc-91dd-48e6-9049-724c2f87043d-kube-api-access-lwwlx\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.557127 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-config-data\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.557151 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.557206 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-log-httpd\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.557248 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-run-httpd\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 
14:21:32.635294 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5f61-account-create-jbd7r"] Sep 30 14:21:32 crc kubenswrapper[4676]: W0930 14:21:32.654449 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96101623_7868_4b78_8c66_9aeb8e5e8ec5.slice/crio-ff0a1d5617a734335d0231d1e672c25c6fb01d1aec78da1d0a8c7bb8534a6969 WatchSource:0}: Error finding container ff0a1d5617a734335d0231d1e672c25c6fb01d1aec78da1d0a8c7bb8534a6969: Status 404 returned error can't find the container with id ff0a1d5617a734335d0231d1e672c25c6fb01d1aec78da1d0a8c7bb8534a6969 Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.659079 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.659137 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-scripts\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.659175 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwwlx\" (UniqueName: \"kubernetes.io/projected/b4192acc-91dd-48e6-9049-724c2f87043d-kube-api-access-lwwlx\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.659202 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-config-data\") pod 
\"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.659227 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.664802 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-log-httpd\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.665722 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-log-httpd\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.667023 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.667815 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-scripts\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.670333 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-config-data\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.670351 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.670482 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-run-httpd\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.670742 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-run-httpd\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.685591 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwwlx\" (UniqueName: \"kubernetes.io/projected/b4192acc-91dd-48e6-9049-724c2f87043d-kube-api-access-lwwlx\") pod \"ceilometer-0\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " pod="openstack/ceilometer-0" Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.747053 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1872-account-create-m4vlg"] Sep 30 14:21:32 crc kubenswrapper[4676]: I0930 14:21:32.810538 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.057520 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ccb8-account-create-f8xw8" event={"ID":"69502754-998e-4124-8956-67747a925b66","Type":"ContainerStarted","Data":"097eb66c9d5a70284123973c8052d068773ffef68ed246953b44e1c9e247d670"} Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.057908 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ccb8-account-create-f8xw8" event={"ID":"69502754-998e-4124-8956-67747a925b66","Type":"ContainerStarted","Data":"96a27ae723547227f12842f3675b58adc55e35e7539b20f7d33a707ba6b2ec21"} Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.068171 4676 generic.go:334] "Generic (PLEG): container finished" podID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerID="7c0af458813ddf93c4dc88ec6ea27a8727dbfa23aae24c5dc0df4b8cb0d0af17" exitCode=0 Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.068252 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbb84f9b4-l58vk" event={"ID":"e8be2e62-b82f-43cb-b1bb-099d7d0f3478","Type":"ContainerDied","Data":"7c0af458813ddf93c4dc88ec6ea27a8727dbfa23aae24c5dc0df4b8cb0d0af17"} Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.074789 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5f61-account-create-jbd7r" event={"ID":"96101623-7868-4b78-8c66-9aeb8e5e8ec5","Type":"ContainerStarted","Data":"bcf77ab230c8ecdb557bad36d67b63abf7e8df5071f117b4ab224fd883d64d3c"} Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.074842 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5f61-account-create-jbd7r" event={"ID":"96101623-7868-4b78-8c66-9aeb8e5e8ec5","Type":"ContainerStarted","Data":"ff0a1d5617a734335d0231d1e672c25c6fb01d1aec78da1d0a8c7bb8534a6969"} Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.082385 4676 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1872-account-create-m4vlg" event={"ID":"d8eb7b5d-dac6-4093-a364-cc7311e159ef","Type":"ContainerStarted","Data":"9d99058d6bebd5c65f55c6f30f0b2166b4dedbca8a5b7726560e1f66b07e386b"} Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.082440 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1872-account-create-m4vlg" event={"ID":"d8eb7b5d-dac6-4093-a364-cc7311e159ef","Type":"ContainerStarted","Data":"65d83f674ce190baea11488f64c6b380463278a9ecebbc48d451c2ddfa2712f3"} Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.088260 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-ccb8-account-create-f8xw8" podStartSLOduration=2.088235368 podStartE2EDuration="2.088235368s" podCreationTimestamp="2025-09-30 14:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:33.075745771 +0000 UTC m=+1397.058834190" watchObservedRunningTime="2025-09-30 14:21:33.088235368 +0000 UTC m=+1397.071323797" Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.348407 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:33 crc kubenswrapper[4676]: W0930 14:21:33.357513 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4192acc_91dd_48e6_9049_724c2f87043d.slice/crio-a91acfac00354946feb900757fbb1dac3c01f011ddcbe3a5752687b2509b08b8 WatchSource:0}: Error finding container a91acfac00354946feb900757fbb1dac3c01f011ddcbe3a5752687b2509b08b8: Status 404 returned error can't find the container with id a91acfac00354946feb900757fbb1dac3c01f011ddcbe3a5752687b2509b08b8 Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.401165 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 
14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.401434 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-log" containerID="cri-o://3ae371b754f0fb586408bdbfb98125a3b3dd21e489e16eeea16990c6f8789542" gracePeriod=30 Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.401930 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-httpd" containerID="cri-o://66dab40a6f6f0114922534982e6172da0eede50d0b26c825e28b48c515304ad4" gracePeriod=30 Sep 30 14:21:33 crc kubenswrapper[4676]: I0930 14:21:33.449313 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f3cbef-5c84-4550-a054-243260e64ccb" path="/var/lib/kubelet/pods/33f3cbef-5c84-4550-a054-243260e64ccb/volumes" Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.093164 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ccb8-account-create-f8xw8" event={"ID":"69502754-998e-4124-8956-67747a925b66","Type":"ContainerDied","Data":"097eb66c9d5a70284123973c8052d068773ffef68ed246953b44e1c9e247d670"} Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.093440 4676 generic.go:334] "Generic (PLEG): container finished" podID="69502754-998e-4124-8956-67747a925b66" containerID="097eb66c9d5a70284123973c8052d068773ffef68ed246953b44e1c9e247d670" exitCode=0 Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.096076 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerStarted","Data":"a91acfac00354946feb900757fbb1dac3c01f011ddcbe3a5752687b2509b08b8"} Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.098474 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="96101623-7868-4b78-8c66-9aeb8e5e8ec5" containerID="bcf77ab230c8ecdb557bad36d67b63abf7e8df5071f117b4ab224fd883d64d3c" exitCode=0 Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.098531 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5f61-account-create-jbd7r" event={"ID":"96101623-7868-4b78-8c66-9aeb8e5e8ec5","Type":"ContainerDied","Data":"bcf77ab230c8ecdb557bad36d67b63abf7e8df5071f117b4ab224fd883d64d3c"} Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.100848 4676 generic.go:334] "Generic (PLEG): container finished" podID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerID="3ae371b754f0fb586408bdbfb98125a3b3dd21e489e16eeea16990c6f8789542" exitCode=143 Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.100923 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00c04551-20dc-4c4a-bb5b-39012ad94d51","Type":"ContainerDied","Data":"3ae371b754f0fb586408bdbfb98125a3b3dd21e489e16eeea16990c6f8789542"} Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.103263 4676 generic.go:334] "Generic (PLEG): container finished" podID="d8eb7b5d-dac6-4093-a364-cc7311e159ef" containerID="9d99058d6bebd5c65f55c6f30f0b2166b4dedbca8a5b7726560e1f66b07e386b" exitCode=0 Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.103303 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1872-account-create-m4vlg" event={"ID":"d8eb7b5d-dac6-4093-a364-cc7311e159ef","Type":"ContainerDied","Data":"9d99058d6bebd5c65f55c6f30f0b2166b4dedbca8a5b7726560e1f66b07e386b"} Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.271018 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.271256 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="77375b04-44bb-4250-a54c-0c193201f738" 
containerName="glance-log" containerID="cri-o://37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291" gracePeriod=30 Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.271391 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-httpd" containerID="cri-o://925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9" gracePeriod=30 Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.565049 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1872-account-create-m4vlg" Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.687850 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5f61-account-create-jbd7r" Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.695328 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.708793 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5pg6\" (UniqueName: \"kubernetes.io/projected/d8eb7b5d-dac6-4093-a364-cc7311e159ef-kube-api-access-n5pg6\") pod \"d8eb7b5d-dac6-4093-a364-cc7311e159ef\" (UID: \"d8eb7b5d-dac6-4093-a364-cc7311e159ef\") " Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.718249 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8eb7b5d-dac6-4093-a364-cc7311e159ef-kube-api-access-n5pg6" (OuterVolumeSpecName: "kube-api-access-n5pg6") pod "d8eb7b5d-dac6-4093-a364-cc7311e159ef" (UID: "d8eb7b5d-dac6-4093-a364-cc7311e159ef"). InnerVolumeSpecName "kube-api-access-n5pg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.811488 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5ftd\" (UniqueName: \"kubernetes.io/projected/96101623-7868-4b78-8c66-9aeb8e5e8ec5-kube-api-access-v5ftd\") pod \"96101623-7868-4b78-8c66-9aeb8e5e8ec5\" (UID: \"96101623-7868-4b78-8c66-9aeb8e5e8ec5\") " Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.812346 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5pg6\" (UniqueName: \"kubernetes.io/projected/d8eb7b5d-dac6-4093-a364-cc7311e159ef-kube-api-access-n5pg6\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.816182 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96101623-7868-4b78-8c66-9aeb8e5e8ec5-kube-api-access-v5ftd" (OuterVolumeSpecName: "kube-api-access-v5ftd") pod "96101623-7868-4b78-8c66-9aeb8e5e8ec5" (UID: "96101623-7868-4b78-8c66-9aeb8e5e8ec5"). InnerVolumeSpecName "kube-api-access-v5ftd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:34 crc kubenswrapper[4676]: I0930 14:21:34.913835 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5ftd\" (UniqueName: \"kubernetes.io/projected/96101623-7868-4b78-8c66-9aeb8e5e8ec5-kube-api-access-v5ftd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.124448 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1872-account-create-m4vlg" event={"ID":"d8eb7b5d-dac6-4093-a364-cc7311e159ef","Type":"ContainerDied","Data":"65d83f674ce190baea11488f64c6b380463278a9ecebbc48d451c2ddfa2712f3"} Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.124696 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65d83f674ce190baea11488f64c6b380463278a9ecebbc48d451c2ddfa2712f3" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.124839 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1872-account-create-m4vlg" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.127763 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerStarted","Data":"5317f29887f8501b5bcf2aaa71651b5a733faeb8ce126f8fbaf16d5d10878774"} Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.127806 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerStarted","Data":"a316181a84e64a874e15dc419463cdcd155cd443703667e9f347429c1c83ae7c"} Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.129872 4676 generic.go:334] "Generic (PLEG): container finished" podID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerID="3961d18819a828f34caead58848f4b73c474baf1b075cbe22e7270b079e01318" exitCode=0 Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.130025 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbb84f9b4-l58vk" event={"ID":"e8be2e62-b82f-43cb-b1bb-099d7d0f3478","Type":"ContainerDied","Data":"3961d18819a828f34caead58848f4b73c474baf1b075cbe22e7270b079e01318"} Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.131410 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5f61-account-create-jbd7r" event={"ID":"96101623-7868-4b78-8c66-9aeb8e5e8ec5","Type":"ContainerDied","Data":"ff0a1d5617a734335d0231d1e672c25c6fb01d1aec78da1d0a8c7bb8534a6969"} Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.131437 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff0a1d5617a734335d0231d1e672c25c6fb01d1aec78da1d0a8c7bb8534a6969" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.131517 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5f61-account-create-jbd7r" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.147006 4676 generic.go:334] "Generic (PLEG): container finished" podID="77375b04-44bb-4250-a54c-0c193201f738" containerID="37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291" exitCode=143 Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.147211 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"77375b04-44bb-4250-a54c-0c193201f738","Type":"ContainerDied","Data":"37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291"} Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.198324 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.322994 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-config\") pod \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.323139 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-combined-ca-bundle\") pod \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.323224 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-httpd-config\") pod \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.323308 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-ovndb-tls-certs\") pod \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.323361 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfzpt\" (UniqueName: \"kubernetes.io/projected/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-kube-api-access-vfzpt\") pod \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\" (UID: \"e8be2e62-b82f-43cb-b1bb-099d7d0f3478\") " Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.331354 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-kube-api-access-vfzpt" (OuterVolumeSpecName: "kube-api-access-vfzpt") pod "e8be2e62-b82f-43cb-b1bb-099d7d0f3478" (UID: "e8be2e62-b82f-43cb-b1bb-099d7d0f3478"). InnerVolumeSpecName "kube-api-access-vfzpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.334150 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e8be2e62-b82f-43cb-b1bb-099d7d0f3478" (UID: "e8be2e62-b82f-43cb-b1bb-099d7d0f3478"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.441172 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.441213 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfzpt\" (UniqueName: \"kubernetes.io/projected/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-kube-api-access-vfzpt\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.461860 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-config" (OuterVolumeSpecName: "config") pod "e8be2e62-b82f-43cb-b1bb-099d7d0f3478" (UID: "e8be2e62-b82f-43cb-b1bb-099d7d0f3478"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.504851 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e8be2e62-b82f-43cb-b1bb-099d7d0f3478" (UID: "e8be2e62-b82f-43cb-b1bb-099d7d0f3478"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.509155 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8be2e62-b82f-43cb-b1bb-099d7d0f3478" (UID: "e8be2e62-b82f-43cb-b1bb-099d7d0f3478"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.547386 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.547426 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.547440 4676 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8be2e62-b82f-43cb-b1bb-099d7d0f3478-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.653875 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ccb8-account-create-f8xw8" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.752603 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwzr2\" (UniqueName: \"kubernetes.io/projected/69502754-998e-4124-8956-67747a925b66-kube-api-access-qwzr2\") pod \"69502754-998e-4124-8956-67747a925b66\" (UID: \"69502754-998e-4124-8956-67747a925b66\") " Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.773138 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69502754-998e-4124-8956-67747a925b66-kube-api-access-qwzr2" (OuterVolumeSpecName: "kube-api-access-qwzr2") pod "69502754-998e-4124-8956-67747a925b66" (UID: "69502754-998e-4124-8956-67747a925b66"). InnerVolumeSpecName "kube-api-access-qwzr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:35 crc kubenswrapper[4676]: I0930 14:21:35.860995 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwzr2\" (UniqueName: \"kubernetes.io/projected/69502754-998e-4124-8956-67747a925b66-kube-api-access-qwzr2\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.157517 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerStarted","Data":"7f9a9363dd2efb8b0d225b0419d5003e0dd11e4c40e34876500fdfacd02f2ac5"} Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.160333 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbb84f9b4-l58vk" event={"ID":"e8be2e62-b82f-43cb-b1bb-099d7d0f3478","Type":"ContainerDied","Data":"5a30999bbbe9b011dc6a502749b1d7d060026c5693c8b509173f2513286bcdc0"} Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.160390 4676 scope.go:117] "RemoveContainer" containerID="7c0af458813ddf93c4dc88ec6ea27a8727dbfa23aae24c5dc0df4b8cb0d0af17" Sep 30 14:21:36 crc 
kubenswrapper[4676]: I0930 14:21:36.160352 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbb84f9b4-l58vk" Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.163378 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ccb8-account-create-f8xw8" event={"ID":"69502754-998e-4124-8956-67747a925b66","Type":"ContainerDied","Data":"96a27ae723547227f12842f3675b58adc55e35e7539b20f7d33a707ba6b2ec21"} Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.163462 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96a27ae723547227f12842f3675b58adc55e35e7539b20f7d33a707ba6b2ec21" Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.163541 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ccb8-account-create-f8xw8" Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.199970 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbb84f9b4-l58vk"] Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.212571 4676 scope.go:117] "RemoveContainer" containerID="3961d18819a828f34caead58848f4b73c474baf1b075cbe22e7270b079e01318" Sep 30 14:21:36 crc kubenswrapper[4676]: I0930 14:21:36.213497 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bbb84f9b4-l58vk"] Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.019026 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-589w4"] Sep 30 14:21:37 crc kubenswrapper[4676]: E0930 14:21:37.020184 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-httpd" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020214 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-httpd" Sep 30 14:21:37 crc kubenswrapper[4676]: 
E0930 14:21:37.020249 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96101623-7868-4b78-8c66-9aeb8e5e8ec5" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020261 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="96101623-7868-4b78-8c66-9aeb8e5e8ec5" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: E0930 14:21:37.020302 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69502754-998e-4124-8956-67747a925b66" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020311 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="69502754-998e-4124-8956-67747a925b66" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: E0930 14:21:37.020324 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8eb7b5d-dac6-4093-a364-cc7311e159ef" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020331 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8eb7b5d-dac6-4093-a364-cc7311e159ef" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: E0930 14:21:37.020351 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-api" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020358 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-api" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020569 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-httpd" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020592 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="69502754-998e-4124-8956-67747a925b66" containerName="mariadb-account-create" Sep 30 
14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020602 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="96101623-7868-4b78-8c66-9aeb8e5e8ec5" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020612 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" containerName="neutron-api" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.020623 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8eb7b5d-dac6-4093-a364-cc7311e159ef" containerName="mariadb-account-create" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.021614 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.026076 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.026289 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bdwsz" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.029308 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.062397 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-589w4"] Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.100215 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.100837 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-scripts\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.100874 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wltxv\" (UniqueName: \"kubernetes.io/projected/18f239b5-877b-4291-8481-6a121c25bff9-kube-api-access-wltxv\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.101022 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-config-data\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.177898 4676 generic.go:334] "Generic (PLEG): container finished" podID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerID="66dab40a6f6f0114922534982e6172da0eede50d0b26c825e28b48c515304ad4" exitCode=0 Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.177956 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00c04551-20dc-4c4a-bb5b-39012ad94d51","Type":"ContainerDied","Data":"66dab40a6f6f0114922534982e6172da0eede50d0b26c825e28b48c515304ad4"} Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.178055 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"00c04551-20dc-4c4a-bb5b-39012ad94d51","Type":"ContainerDied","Data":"47b6b2ec83b27f33b025f65d8d9b93a207954a291ca9f62a7c6a46f4277138b6"} Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.178070 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b6b2ec83b27f33b025f65d8d9b93a207954a291ca9f62a7c6a46f4277138b6" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.203002 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-config-data\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.203210 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.204533 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-scripts\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.204614 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wltxv\" (UniqueName: \"kubernetes.io/projected/18f239b5-877b-4291-8481-6a121c25bff9-kube-api-access-wltxv\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc 
kubenswrapper[4676]: I0930 14:21:37.209716 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-config-data\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.210362 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-scripts\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.211687 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.226721 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wltxv\" (UniqueName: \"kubernetes.io/projected/18f239b5-877b-4291-8481-6a121c25bff9-kube-api-access-wltxv\") pod \"nova-cell0-conductor-db-sync-589w4\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.245793 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.381521 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.416608 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-logs\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.416766 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.416799 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-config-data\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.416924 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-combined-ca-bundle\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.416954 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmj5w\" (UniqueName: \"kubernetes.io/projected/00c04551-20dc-4c4a-bb5b-39012ad94d51-kube-api-access-zmj5w\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.416985 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-public-tls-certs\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.417036 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-httpd-run\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.417054 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-scripts\") pod \"00c04551-20dc-4c4a-bb5b-39012ad94d51\" (UID: \"00c04551-20dc-4c4a-bb5b-39012ad94d51\") " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.433180 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.433437 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-logs" (OuterVolumeSpecName: "logs") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.438246 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-scripts" (OuterVolumeSpecName: "scripts") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.443802 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.444209 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c04551-20dc-4c4a-bb5b-39012ad94d51-kube-api-access-zmj5w" (OuterVolumeSpecName: "kube-api-access-zmj5w") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "kube-api-access-zmj5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.456778 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8be2e62-b82f-43cb-b1bb-099d7d0f3478" path="/var/lib/kubelet/pods/e8be2e62-b82f-43cb-b1bb-099d7d0f3478/volumes" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.490889 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.513223 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.519116 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.519164 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.519178 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmj5w\" (UniqueName: \"kubernetes.io/projected/00c04551-20dc-4c4a-bb5b-39012ad94d51-kube-api-access-zmj5w\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.519188 4676 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.519196 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.519208 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.519219 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c04551-20dc-4c4a-bb5b-39012ad94d51-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.536622 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-config-data" (OuterVolumeSpecName: "config-data") pod "00c04551-20dc-4c4a-bb5b-39012ad94d51" (UID: "00c04551-20dc-4c4a-bb5b-39012ad94d51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.553502 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.620693 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:37 crc kubenswrapper[4676]: I0930 14:21:37.620720 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c04551-20dc-4c4a-bb5b-39012ad94d51-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.007469 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-589w4"] Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.105525 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.203131 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-589w4" event={"ID":"18f239b5-877b-4291-8481-6a121c25bff9","Type":"ContainerStarted","Data":"1792e23a7c12b6c5d711f4bceff49e9f9eca9c7865b39256123e60375ed0954f"} Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.231987 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerStarted","Data":"c44fbee39b5cbb506a60b7580dd96675036b64e9982f8f9878b3268fad375e1c"} Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.232160 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-central-agent" containerID="cri-o://a316181a84e64a874e15dc419463cdcd155cd443703667e9f347429c1c83ae7c" gracePeriod=30 Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.232241 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="proxy-httpd" containerID="cri-o://c44fbee39b5cbb506a60b7580dd96675036b64e9982f8f9878b3268fad375e1c" gracePeriod=30 Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.232263 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.232276 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="sg-core" containerID="cri-o://7f9a9363dd2efb8b0d225b0419d5003e0dd11e4c40e34876500fdfacd02f2ac5" gracePeriod=30 Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.232308 4676 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-notification-agent" containerID="cri-o://5317f29887f8501b5bcf2aaa71651b5a733faeb8ce126f8fbaf16d5d10878774" gracePeriod=30 Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.235761 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.235850 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-combined-ca-bundle\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.235925 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-scripts\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.235982 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-logs\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.236063 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-config-data\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 
14:21:38.236098 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-httpd-run\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.236146 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pgdf\" (UniqueName: \"kubernetes.io/projected/77375b04-44bb-4250-a54c-0c193201f738-kube-api-access-5pgdf\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.236269 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-internal-tls-certs\") pod \"77375b04-44bb-4250-a54c-0c193201f738\" (UID: \"77375b04-44bb-4250-a54c-0c193201f738\") " Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.237443 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-logs" (OuterVolumeSpecName: "logs") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.238227 4676 generic.go:334] "Generic (PLEG): container finished" podID="77375b04-44bb-4250-a54c-0c193201f738" containerID="925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9" exitCode=0
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.238351 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.238387 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.241493 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.242108 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"77375b04-44bb-4250-a54c-0c193201f738","Type":"ContainerDied","Data":"925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9"}
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.242145 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"77375b04-44bb-4250-a54c-0c193201f738","Type":"ContainerDied","Data":"5c0285907c9d2d39e4d88107a5a2f6019fcff6a73931bedf16354a62a21b9d01"}
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.242163 4676 scope.go:117] "RemoveContainer" containerID="925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.268432 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77375b04-44bb-4250-a54c-0c193201f738-kube-api-access-5pgdf" (OuterVolumeSpecName: "kube-api-access-5pgdf") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "kube-api-access-5pgdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.268944 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-scripts" (OuterVolumeSpecName: "scripts") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.290834 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.315170 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.727723482 podStartE2EDuration="6.315145952s" podCreationTimestamp="2025-09-30 14:21:32 +0000 UTC" firstStartedPulling="2025-09-30 14:21:33.360173716 +0000 UTC m=+1397.343262145" lastFinishedPulling="2025-09-30 14:21:36.947596186 +0000 UTC m=+1400.930684615" observedRunningTime="2025-09-30 14:21:38.304912294 +0000 UTC m=+1402.288000733" watchObservedRunningTime="2025-09-30 14:21:38.315145952 +0000 UTC m=+1402.298234381"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.338174 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.338212 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-logs\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.338221 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/77375b04-44bb-4250-a54c-0c193201f738-httpd-run\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.338232 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pgdf\" (UniqueName: \"kubernetes.io/projected/77375b04-44bb-4250-a54c-0c193201f738-kube-api-access-5pgdf\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.338257 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.350714 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.376029 4676 scope.go:117] "RemoveContainer" containerID="37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.408078 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.414121 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.424306 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.427093 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.428023 4676 scope.go:117] "RemoveContainer" containerID="925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9"
Sep 30 14:21:38 crc kubenswrapper[4676]: E0930 14:21:38.430243 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9\": container with ID starting with 925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9 not found: ID does not exist" containerID="925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.430291 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9"} err="failed to get container status \"925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9\": rpc error: code = NotFound desc = could not find container \"925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9\": container with ID starting with 925d36d9dc230da8ad39f5b029158299a72246e44963e0fa825f764b1c91bfa9 not found: ID does not exist"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.430328 4676 scope.go:117] "RemoveContainer" containerID="37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.433400 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-config-data" (OuterVolumeSpecName: "config-data") pod "77375b04-44bb-4250-a54c-0c193201f738" (UID: "77375b04-44bb-4250-a54c-0c193201f738"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:21:38 crc kubenswrapper[4676]: E0930 14:21:38.435034 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291\": container with ID starting with 37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291 not found: ID does not exist" containerID="37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.435087 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291"} err="failed to get container status \"37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291\": rpc error: code = NotFound desc = could not find container \"37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291\": container with ID starting with 37016d8d8c710b42210167415babc7230f414a4ecdc9f080d9f5aca1c982d291 not found: ID does not exist"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.436487 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: E0930 14:21:38.437185 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-httpd"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437213 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-httpd"
Sep 30 14:21:38 crc kubenswrapper[4676]: E0930 14:21:38.437260 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-log"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437272 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-log"
Sep 30 14:21:38 crc kubenswrapper[4676]: E0930 14:21:38.437291 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-log"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437300 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-log"
Sep 30 14:21:38 crc kubenswrapper[4676]: E0930 14:21:38.437324 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-httpd"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437333 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-httpd"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437597 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-log"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437623 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-log"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437653 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-httpd"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.437663 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" containerName="glance-httpd"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.440445 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.440492 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.440520 4676 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.440535 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.440544 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77375b04-44bb-4250-a54c-0c193201f738-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.442431 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.445806 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.445991 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542544 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542587 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjm5\" (UniqueName: \"kubernetes.io/projected/a22bcf65-b8af-4f8a-845c-31b1b3609e05-kube-api-access-bgjm5\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542610 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542683 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22bcf65-b8af-4f8a-845c-31b1b3609e05-logs\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542700 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542756 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542787 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.542833 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a22bcf65-b8af-4f8a-845c-31b1b3609e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.578280 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.585347 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.606839 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.608637 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.613192 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.613864 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.620280 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644360 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a22bcf65-b8af-4f8a-845c-31b1b3609e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644438 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644466 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjm5\" (UniqueName: \"kubernetes.io/projected/a22bcf65-b8af-4f8a-845c-31b1b3609e05-kube-api-access-bgjm5\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644492 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644535 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22bcf65-b8af-4f8a-845c-31b1b3609e05-logs\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644554 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644608 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644632 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644946 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a22bcf65-b8af-4f8a-845c-31b1b3609e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.644994 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.645075 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22bcf65-b8af-4f8a-845c-31b1b3609e05-logs\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.652760 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.653940 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.655982 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.664301 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22bcf65-b8af-4f8a-845c-31b1b3609e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.684997 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjm5\" (UniqueName: \"kubernetes.io/projected/a22bcf65-b8af-4f8a-845c-31b1b3609e05-kube-api-access-bgjm5\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.694360 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"a22bcf65-b8af-4f8a-845c-31b1b3609e05\") " pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.746726 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.746805 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkjkq\" (UniqueName: \"kubernetes.io/projected/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-kube-api-access-zkjkq\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.746850 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.746921 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.746959 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.747018 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-logs\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.747246 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.747311 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.760099 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848537 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848584 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848627 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848668 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkjkq\" (UniqueName: \"kubernetes.io/projected/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-kube-api-access-zkjkq\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848722 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848750 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.848795 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-logs\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.849228 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.849479 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-logs\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.849738 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.856207 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.859999 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.870383 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.872802 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.876379 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkjkq\" (UniqueName: \"kubernetes.io/projected/cec7cd30-e0cb-41bb-a620-8d3fad4e2338-kube-api-access-zkjkq\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:38 crc kubenswrapper[4676]: I0930 14:21:38.928142 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"cec7cd30-e0cb-41bb-a620-8d3fad4e2338\") " pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.183606 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.264216 4676 generic.go:334] "Generic (PLEG): container finished" podID="b4192acc-91dd-48e6-9049-724c2f87043d" containerID="c44fbee39b5cbb506a60b7580dd96675036b64e9982f8f9878b3268fad375e1c" exitCode=0
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.264264 4676 generic.go:334] "Generic (PLEG): container finished" podID="b4192acc-91dd-48e6-9049-724c2f87043d" containerID="7f9a9363dd2efb8b0d225b0419d5003e0dd11e4c40e34876500fdfacd02f2ac5" exitCode=2
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.264291 4676 generic.go:334] "Generic (PLEG): container finished" podID="b4192acc-91dd-48e6-9049-724c2f87043d" containerID="5317f29887f8501b5bcf2aaa71651b5a733faeb8ce126f8fbaf16d5d10878774" exitCode=0
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.264356 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerDied","Data":"c44fbee39b5cbb506a60b7580dd96675036b64e9982f8f9878b3268fad375e1c"}
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.264387 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerDied","Data":"7f9a9363dd2efb8b0d225b0419d5003e0dd11e4c40e34876500fdfacd02f2ac5"}
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.264399 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerDied","Data":"5317f29887f8501b5bcf2aaa71651b5a733faeb8ce126f8fbaf16d5d10878774"}
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.373084 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 14:21:39 crc kubenswrapper[4676]: W0930 14:21:39.410241 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda22bcf65_b8af_4f8a_845c_31b1b3609e05.slice/crio-33598d9c1a98e3e7e5830b0637504634086d5e8f09fd4bc7f17822c5f2e6afef WatchSource:0}: Error finding container 33598d9c1a98e3e7e5830b0637504634086d5e8f09fd4bc7f17822c5f2e6afef: Status 404 returned error can't find the container with id 33598d9c1a98e3e7e5830b0637504634086d5e8f09fd4bc7f17822c5f2e6afef
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.445634 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c04551-20dc-4c4a-bb5b-39012ad94d51" path="/var/lib/kubelet/pods/00c04551-20dc-4c4a-bb5b-39012ad94d51/volumes"
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.446756 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77375b04-44bb-4250-a54c-0c193201f738" path="/var/lib/kubelet/pods/77375b04-44bb-4250-a54c-0c193201f738/volumes"
Sep 30 14:21:39 crc kubenswrapper[4676]: I0930 14:21:39.766035 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 14:21:40 crc kubenswrapper[4676]: I0930 14:21:40.290241 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cec7cd30-e0cb-41bb-a620-8d3fad4e2338","Type":"ContainerStarted","Data":"d5d571be63725e83738d194d471ce07a705a415a7b9b07005cf052ea19c8a56f"}
Sep 30 14:21:40 crc kubenswrapper[4676]: I0930 14:21:40.298855 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a22bcf65-b8af-4f8a-845c-31b1b3609e05","Type":"ContainerStarted","Data":"47a6fed5a84391719ae2fa577e29a516f9b00c14057904e54b1d16f6d3e6d253"}
Sep 30 14:21:40 crc kubenswrapper[4676]: I0930 14:21:40.298945 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0"
event={"ID":"a22bcf65-b8af-4f8a-845c-31b1b3609e05","Type":"ContainerStarted","Data":"33598d9c1a98e3e7e5830b0637504634086d5e8f09fd4bc7f17822c5f2e6afef"} Sep 30 14:21:41 crc kubenswrapper[4676]: I0930 14:21:41.314127 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a22bcf65-b8af-4f8a-845c-31b1b3609e05","Type":"ContainerStarted","Data":"5d0402d809c0abe6ca1798e6f74f414459f57c066558f8ca8051f9a32d3977a2"} Sep 30 14:21:41 crc kubenswrapper[4676]: I0930 14:21:41.321165 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cec7cd30-e0cb-41bb-a620-8d3fad4e2338","Type":"ContainerStarted","Data":"38ffba4ee73e8324c6c67ab7dbbd73cbbec4aed664b0c41a351aacfd6e3ab3a5"} Sep 30 14:21:41 crc kubenswrapper[4676]: I0930 14:21:41.321262 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cec7cd30-e0cb-41bb-a620-8d3fad4e2338","Type":"ContainerStarted","Data":"23a4a6984ec963fae301d1f63f2c23b99f5612c6fa71c854b49cbb9e147bb8d3"} Sep 30 14:21:41 crc kubenswrapper[4676]: I0930 14:21:41.340470 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.3404463189999998 podStartE2EDuration="3.340446319s" podCreationTimestamp="2025-09-30 14:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:41.334316788 +0000 UTC m=+1405.317405227" watchObservedRunningTime="2025-09-30 14:21:41.340446319 +0000 UTC m=+1405.323534748" Sep 30 14:21:41 crc kubenswrapper[4676]: I0930 14:21:41.376720 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.37669888 podStartE2EDuration="3.37669888s" podCreationTimestamp="2025-09-30 14:21:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:21:41.370277821 +0000 UTC m=+1405.353366270" watchObservedRunningTime="2025-09-30 14:21:41.37669888 +0000 UTC m=+1405.359787309" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.350226 4676 generic.go:334] "Generic (PLEG): container finished" podID="b4192acc-91dd-48e6-9049-724c2f87043d" containerID="a316181a84e64a874e15dc419463cdcd155cd443703667e9f347429c1c83ae7c" exitCode=0 Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.351002 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerDied","Data":"a316181a84e64a874e15dc419463cdcd155cd443703667e9f347429c1c83ae7c"} Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.351078 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4192acc-91dd-48e6-9049-724c2f87043d","Type":"ContainerDied","Data":"a91acfac00354946feb900757fbb1dac3c01f011ddcbe3a5752687b2509b08b8"} Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.351091 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a91acfac00354946feb900757fbb1dac3c01f011ddcbe3a5752687b2509b08b8" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.390093 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453086 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-sg-core-conf-yaml\") pod \"b4192acc-91dd-48e6-9049-724c2f87043d\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453223 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-log-httpd\") pod \"b4192acc-91dd-48e6-9049-724c2f87043d\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453281 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-combined-ca-bundle\") pod \"b4192acc-91dd-48e6-9049-724c2f87043d\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453449 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwwlx\" (UniqueName: \"kubernetes.io/projected/b4192acc-91dd-48e6-9049-724c2f87043d-kube-api-access-lwwlx\") pod \"b4192acc-91dd-48e6-9049-724c2f87043d\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453499 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-config-data\") pod \"b4192acc-91dd-48e6-9049-724c2f87043d\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453618 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-scripts\") pod \"b4192acc-91dd-48e6-9049-724c2f87043d\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453801 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-run-httpd\") pod \"b4192acc-91dd-48e6-9049-724c2f87043d\" (UID: \"b4192acc-91dd-48e6-9049-724c2f87043d\") " Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.453938 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4192acc-91dd-48e6-9049-724c2f87043d" (UID: "b4192acc-91dd-48e6-9049-724c2f87043d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.454481 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4192acc-91dd-48e6-9049-724c2f87043d" (UID: "b4192acc-91dd-48e6-9049-724c2f87043d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.456417 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.456440 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4192acc-91dd-48e6-9049-724c2f87043d-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.460704 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4192acc-91dd-48e6-9049-724c2f87043d-kube-api-access-lwwlx" (OuterVolumeSpecName: "kube-api-access-lwwlx") pod "b4192acc-91dd-48e6-9049-724c2f87043d" (UID: "b4192acc-91dd-48e6-9049-724c2f87043d"). InnerVolumeSpecName "kube-api-access-lwwlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.462748 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-scripts" (OuterVolumeSpecName: "scripts") pod "b4192acc-91dd-48e6-9049-724c2f87043d" (UID: "b4192acc-91dd-48e6-9049-724c2f87043d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.496844 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4192acc-91dd-48e6-9049-724c2f87043d" (UID: "b4192acc-91dd-48e6-9049-724c2f87043d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.559503 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwwlx\" (UniqueName: \"kubernetes.io/projected/b4192acc-91dd-48e6-9049-724c2f87043d-kube-api-access-lwwlx\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.559558 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.559579 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.565696 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4192acc-91dd-48e6-9049-724c2f87043d" (UID: "b4192acc-91dd-48e6-9049-724c2f87043d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.569068 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-config-data" (OuterVolumeSpecName: "config-data") pod "b4192acc-91dd-48e6-9049-724c2f87043d" (UID: "b4192acc-91dd-48e6-9049-724c2f87043d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.661922 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:42 crc kubenswrapper[4676]: I0930 14:21:42.662289 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4192acc-91dd-48e6-9049-724c2f87043d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.372961 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.413107 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.422060 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.445307 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" path="/var/lib/kubelet/pods/b4192acc-91dd-48e6-9049-724c2f87043d/volumes" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.447664 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:43 crc kubenswrapper[4676]: E0930 14:21:43.448418 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="sg-core" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.448492 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="sg-core" Sep 30 14:21:43 crc kubenswrapper[4676]: E0930 14:21:43.448580 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-central-agent" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.448654 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-central-agent" Sep 30 14:21:43 crc kubenswrapper[4676]: E0930 14:21:43.448727 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="proxy-httpd" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.448798 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="proxy-httpd" Sep 30 14:21:43 crc kubenswrapper[4676]: E0930 14:21:43.448909 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-notification-agent" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.448987 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-notification-agent" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.449290 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="proxy-httpd" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.449442 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-central-agent" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.449537 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="ceilometer-notification-agent" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.449632 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4192acc-91dd-48e6-9049-724c2f87043d" containerName="sg-core" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.462241 4676 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.464141 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.472593 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.472745 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.581147 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.581522 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-run-httpd\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.581545 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2slt\" (UniqueName: \"kubernetes.io/projected/018e0e05-7ff4-4620-989d-5843f092e164-kube-api-access-l2slt\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.581590 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-config-data\") pod 
\"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.581617 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-log-httpd\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.581717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-scripts\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.581782 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.683157 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-config-data\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.683227 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-log-httpd\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.683319 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-scripts\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.683484 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.683536 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.683562 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-run-httpd\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.683583 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2slt\" (UniqueName: \"kubernetes.io/projected/018e0e05-7ff4-4620-989d-5843f092e164-kube-api-access-l2slt\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.684869 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-log-httpd\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " 
pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.685027 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-run-httpd\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.690059 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.699377 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.700125 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-config-data\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.701143 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-scripts\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.702643 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2slt\" (UniqueName: 
\"kubernetes.io/projected/018e0e05-7ff4-4620-989d-5843f092e164-kube-api-access-l2slt\") pod \"ceilometer-0\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " pod="openstack/ceilometer-0" Sep 30 14:21:43 crc kubenswrapper[4676]: I0930 14:21:43.787091 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:47 crc kubenswrapper[4676]: I0930 14:21:47.780044 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:47 crc kubenswrapper[4676]: W0930 14:21:47.787097 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod018e0e05_7ff4_4620_989d_5843f092e164.slice/crio-f055b7f2f3536f739746641e0cad10e9f78a142244652fc42ec1a43493abd5f5 WatchSource:0}: Error finding container f055b7f2f3536f739746641e0cad10e9f78a142244652fc42ec1a43493abd5f5: Status 404 returned error can't find the container with id f055b7f2f3536f739746641e0cad10e9f78a142244652fc42ec1a43493abd5f5 Sep 30 14:21:48 crc kubenswrapper[4676]: I0930 14:21:48.470062 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerStarted","Data":"f055b7f2f3536f739746641e0cad10e9f78a142244652fc42ec1a43493abd5f5"} Sep 30 14:21:48 crc kubenswrapper[4676]: I0930 14:21:48.471662 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-589w4" event={"ID":"18f239b5-877b-4291-8481-6a121c25bff9","Type":"ContainerStarted","Data":"7c401ca1e5312a6509c27ce37f7ec2ddd0f578f113baa2f49b4f416f04b3eb5c"} Sep 30 14:21:48 crc kubenswrapper[4676]: I0930 14:21:48.488788 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-589w4" podStartSLOduration=3.052970241 podStartE2EDuration="12.488774126s" podCreationTimestamp="2025-09-30 14:21:36 +0000 UTC" 
firstStartedPulling="2025-09-30 14:21:38.016050472 +0000 UTC m=+1401.999138901" lastFinishedPulling="2025-09-30 14:21:47.451854357 +0000 UTC m=+1411.434942786" observedRunningTime="2025-09-30 14:21:48.488420507 +0000 UTC m=+1412.471508936" watchObservedRunningTime="2025-09-30 14:21:48.488774126 +0000 UTC m=+1412.471862555" Sep 30 14:21:48 crc kubenswrapper[4676]: I0930 14:21:48.760780 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 14:21:48 crc kubenswrapper[4676]: I0930 14:21:48.760828 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 14:21:48 crc kubenswrapper[4676]: I0930 14:21:48.789097 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 14:21:48 crc kubenswrapper[4676]: I0930 14:21:48.800378 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.183830 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.184189 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.218923 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.236127 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.497407 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerStarted","Data":"cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0"} Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.498619 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.498651 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.498709 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.498723 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerStarted","Data":"c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a"} Sep 30 14:21:49 crc kubenswrapper[4676]: I0930 14:21:49.498740 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:50 crc kubenswrapper[4676]: I0930 14:21:50.509179 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerStarted","Data":"6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b"} Sep 30 14:21:51 crc kubenswrapper[4676]: I0930 14:21:51.681836 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 14:21:51 crc kubenswrapper[4676]: I0930 14:21:51.682465 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:21:51 crc kubenswrapper[4676]: I0930 14:21:51.719352 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:51 crc 
kubenswrapper[4676]: I0930 14:21:51.719519 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 14:21:51 crc kubenswrapper[4676]: I0930 14:21:51.723187 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 14:21:51 crc kubenswrapper[4676]: I0930 14:21:51.743662 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 14:21:52 crc kubenswrapper[4676]: I0930 14:21:52.532041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerStarted","Data":"0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1"} Sep 30 14:21:52 crc kubenswrapper[4676]: I0930 14:21:52.532636 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:21:52 crc kubenswrapper[4676]: I0930 14:21:52.560450 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.68657593 podStartE2EDuration="9.560430149s" podCreationTimestamp="2025-09-30 14:21:43 +0000 UTC" firstStartedPulling="2025-09-30 14:21:47.789769464 +0000 UTC m=+1411.772857883" lastFinishedPulling="2025-09-30 14:21:51.663623673 +0000 UTC m=+1415.646712102" observedRunningTime="2025-09-30 14:21:52.558759265 +0000 UTC m=+1416.541847694" watchObservedRunningTime="2025-09-30 14:21:52.560430149 +0000 UTC m=+1416.543518578" Sep 30 14:21:54 crc kubenswrapper[4676]: I0930 14:21:54.099542 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:54 crc kubenswrapper[4676]: I0930 14:21:54.547907 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-central-agent" 
containerID="cri-o://c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a" gracePeriod=30 Sep 30 14:21:54 crc kubenswrapper[4676]: I0930 14:21:54.547976 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="proxy-httpd" containerID="cri-o://0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1" gracePeriod=30 Sep 30 14:21:54 crc kubenswrapper[4676]: I0930 14:21:54.548020 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="sg-core" containerID="cri-o://6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b" gracePeriod=30 Sep 30 14:21:54 crc kubenswrapper[4676]: I0930 14:21:54.548069 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-notification-agent" containerID="cri-o://cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0" gracePeriod=30 Sep 30 14:21:55 crc kubenswrapper[4676]: I0930 14:21:55.561825 4676 generic.go:334] "Generic (PLEG): container finished" podID="018e0e05-7ff4-4620-989d-5843f092e164" containerID="0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1" exitCode=0 Sep 30 14:21:55 crc kubenswrapper[4676]: I0930 14:21:55.562383 4676 generic.go:334] "Generic (PLEG): container finished" podID="018e0e05-7ff4-4620-989d-5843f092e164" containerID="6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b" exitCode=2 Sep 30 14:21:55 crc kubenswrapper[4676]: I0930 14:21:55.562392 4676 generic.go:334] "Generic (PLEG): container finished" podID="018e0e05-7ff4-4620-989d-5843f092e164" containerID="cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0" exitCode=0 Sep 30 14:21:55 crc kubenswrapper[4676]: I0930 14:21:55.562346 4676 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerDied","Data":"0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1"} Sep 30 14:21:55 crc kubenswrapper[4676]: I0930 14:21:55.562434 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerDied","Data":"6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b"} Sep 30 14:21:55 crc kubenswrapper[4676]: I0930 14:21:55.562447 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerDied","Data":"cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0"} Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.485947 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.556097 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-sg-core-conf-yaml\") pod \"018e0e05-7ff4-4620-989d-5843f092e164\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.556239 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2slt\" (UniqueName: \"kubernetes.io/projected/018e0e05-7ff4-4620-989d-5843f092e164-kube-api-access-l2slt\") pod \"018e0e05-7ff4-4620-989d-5843f092e164\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.556272 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-config-data\") pod \"018e0e05-7ff4-4620-989d-5843f092e164\" 
(UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.556292 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-combined-ca-bundle\") pod \"018e0e05-7ff4-4620-989d-5843f092e164\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.557304 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-run-httpd\") pod \"018e0e05-7ff4-4620-989d-5843f092e164\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.557344 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-log-httpd\") pod \"018e0e05-7ff4-4620-989d-5843f092e164\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.557383 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-scripts\") pod \"018e0e05-7ff4-4620-989d-5843f092e164\" (UID: \"018e0e05-7ff4-4620-989d-5843f092e164\") " Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.557614 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "018e0e05-7ff4-4620-989d-5843f092e164" (UID: "018e0e05-7ff4-4620-989d-5843f092e164"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.557847 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.558184 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "018e0e05-7ff4-4620-989d-5843f092e164" (UID: "018e0e05-7ff4-4620-989d-5843f092e164"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.563230 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/018e0e05-7ff4-4620-989d-5843f092e164-kube-api-access-l2slt" (OuterVolumeSpecName: "kube-api-access-l2slt") pod "018e0e05-7ff4-4620-989d-5843f092e164" (UID: "018e0e05-7ff4-4620-989d-5843f092e164"). InnerVolumeSpecName "kube-api-access-l2slt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.564150 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-scripts" (OuterVolumeSpecName: "scripts") pod "018e0e05-7ff4-4620-989d-5843f092e164" (UID: "018e0e05-7ff4-4620-989d-5843f092e164"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.593805 4676 generic.go:334] "Generic (PLEG): container finished" podID="018e0e05-7ff4-4620-989d-5843f092e164" containerID="c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a" exitCode=0 Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.593868 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerDied","Data":"c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a"} Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.593980 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.594048 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"018e0e05-7ff4-4620-989d-5843f092e164","Type":"ContainerDied","Data":"f055b7f2f3536f739746641e0cad10e9f78a142244652fc42ec1a43493abd5f5"} Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.594090 4676 scope.go:117] "RemoveContainer" containerID="0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.601188 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "018e0e05-7ff4-4620-989d-5843f092e164" (UID: "018e0e05-7ff4-4620-989d-5843f092e164"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.652228 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "018e0e05-7ff4-4620-989d-5843f092e164" (UID: "018e0e05-7ff4-4620-989d-5843f092e164"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.658710 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/018e0e05-7ff4-4620-989d-5843f092e164-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.658739 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.658749 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.658760 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2slt\" (UniqueName: \"kubernetes.io/projected/018e0e05-7ff4-4620-989d-5843f092e164-kube-api-access-l2slt\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.658770 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.673617 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-config-data" (OuterVolumeSpecName: "config-data") pod "018e0e05-7ff4-4620-989d-5843f092e164" (UID: "018e0e05-7ff4-4620-989d-5843f092e164"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.673754 4676 scope.go:117] "RemoveContainer" containerID="6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.691930 4676 scope.go:117] "RemoveContainer" containerID="cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.721895 4676 scope.go:117] "RemoveContainer" containerID="c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.750175 4676 scope.go:117] "RemoveContainer" containerID="0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1" Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.750866 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1\": container with ID starting with 0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1 not found: ID does not exist" containerID="0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.750917 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1"} err="failed to get container status \"0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1\": rpc error: code = NotFound desc = could not find container \"0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1\": container with ID starting with 
0c286ac4997b05114d2eb4b41e833d0efbe0c6be5df67ff0558b23c7a5ece3e1 not found: ID does not exist" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.750940 4676 scope.go:117] "RemoveContainer" containerID="6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b" Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.751439 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b\": container with ID starting with 6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b not found: ID does not exist" containerID="6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.751461 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b"} err="failed to get container status \"6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b\": rpc error: code = NotFound desc = could not find container \"6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b\": container with ID starting with 6eb4e20f3ff14a2e138fbddafeebd3a514b6e74e5b270c5dd172192eab0ab67b not found: ID does not exist" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.751476 4676 scope.go:117] "RemoveContainer" containerID="cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0" Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.751998 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0\": container with ID starting with cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0 not found: ID does not exist" containerID="cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0" Sep 30 14:21:56 crc 
kubenswrapper[4676]: I0930 14:21:56.752028 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0"} err="failed to get container status \"cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0\": rpc error: code = NotFound desc = could not find container \"cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0\": container with ID starting with cd05e8eb6cc47850912162a49c10d1a2257a1c06066980a307cac966c2f07af0 not found: ID does not exist" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.752040 4676 scope.go:117] "RemoveContainer" containerID="c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a" Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.752254 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a\": container with ID starting with c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a not found: ID does not exist" containerID="c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.752280 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a"} err="failed to get container status \"c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a\": rpc error: code = NotFound desc = could not find container \"c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a\": container with ID starting with c35f035349b2023c6afae875ad6a34e4c8b90262ebffab69b16e3834f228391a not found: ID does not exist" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.759987 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/018e0e05-7ff4-4620-989d-5843f092e164-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.928133 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.937739 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961005 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.961377 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-notification-agent" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961395 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-notification-agent" Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.961406 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-central-agent" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961413 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-central-agent" Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.961436 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="sg-core" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961442 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="sg-core" Sep 30 14:21:56 crc kubenswrapper[4676]: E0930 14:21:56.961456 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="proxy-httpd" Sep 30 14:21:56 crc 
kubenswrapper[4676]: I0930 14:21:56.961462 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="proxy-httpd" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961635 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-notification-agent" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961653 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="proxy-httpd" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961668 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="ceilometer-central-agent" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.961676 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="018e0e05-7ff4-4620-989d-5843f092e164" containerName="sg-core" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.963379 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.967258 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.967456 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:21:56 crc kubenswrapper[4676]: I0930 14:21:56.977403 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.165634 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-run-httpd\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.165738 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.166082 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.166177 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-config-data\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " 
pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.166200 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxntr\" (UniqueName: \"kubernetes.io/projected/551886de-0586-40d2-9d65-f19496c555db-kube-api-access-hxntr\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.166340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-log-httpd\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.166392 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-scripts\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.268342 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.268399 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.268444 4676 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-config-data\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.268472 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxntr\" (UniqueName: \"kubernetes.io/projected/551886de-0586-40d2-9d65-f19496c555db-kube-api-access-hxntr\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.268544 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-log-httpd\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.268571 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-scripts\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.268634 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-run-httpd\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.269217 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-run-httpd\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: 
I0930 14:21:57.269226 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-log-httpd\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.272619 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.272746 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.274079 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-config-data\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.275012 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-scripts\") pod \"ceilometer-0\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.285833 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxntr\" (UniqueName: \"kubernetes.io/projected/551886de-0586-40d2-9d65-f19496c555db-kube-api-access-hxntr\") pod \"ceilometer-0\" (UID: 
\"551886de-0586-40d2-9d65-f19496c555db\") " pod="openstack/ceilometer-0" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.448233 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="018e0e05-7ff4-4620-989d-5843f092e164" path="/var/lib/kubelet/pods/018e0e05-7ff4-4620-989d-5843f092e164/volumes" Sep 30 14:21:57 crc kubenswrapper[4676]: I0930 14:21:57.581688 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:21:58 crc kubenswrapper[4676]: I0930 14:21:58.056604 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:21:58 crc kubenswrapper[4676]: W0930 14:21:58.063035 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod551886de_0586_40d2_9d65_f19496c555db.slice/crio-02bef6143c61f7e746313cd45735112772f757786d32f59f3834ba0fbf7dee37 WatchSource:0}: Error finding container 02bef6143c61f7e746313cd45735112772f757786d32f59f3834ba0fbf7dee37: Status 404 returned error can't find the container with id 02bef6143c61f7e746313cd45735112772f757786d32f59f3834ba0fbf7dee37 Sep 30 14:21:58 crc kubenswrapper[4676]: I0930 14:21:58.615836 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerStarted","Data":"02bef6143c61f7e746313cd45735112772f757786d32f59f3834ba0fbf7dee37"} Sep 30 14:21:59 crc kubenswrapper[4676]: I0930 14:21:59.632529 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerStarted","Data":"b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b"} Sep 30 14:21:59 crc kubenswrapper[4676]: I0930 14:21:59.632977 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerStarted","Data":"bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d"} Sep 30 14:22:00 crc kubenswrapper[4676]: I0930 14:22:00.642571 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerStarted","Data":"8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4"} Sep 30 14:22:01 crc kubenswrapper[4676]: I0930 14:22:01.655036 4676 generic.go:334] "Generic (PLEG): container finished" podID="18f239b5-877b-4291-8481-6a121c25bff9" containerID="7c401ca1e5312a6509c27ce37f7ec2ddd0f578f113baa2f49b4f416f04b3eb5c" exitCode=0 Sep 30 14:22:01 crc kubenswrapper[4676]: I0930 14:22:01.655254 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-589w4" event={"ID":"18f239b5-877b-4291-8481-6a121c25bff9","Type":"ContainerDied","Data":"7c401ca1e5312a6509c27ce37f7ec2ddd0f578f113baa2f49b4f416f04b3eb5c"} Sep 30 14:22:02 crc kubenswrapper[4676]: I0930 14:22:02.667826 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerStarted","Data":"aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e"} Sep 30 14:22:02 crc kubenswrapper[4676]: I0930 14:22:02.668184 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:22:02 crc kubenswrapper[4676]: I0930 14:22:02.705059 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.24842205 podStartE2EDuration="6.705033993s" podCreationTimestamp="2025-09-30 14:21:56 +0000 UTC" firstStartedPulling="2025-09-30 14:21:58.066058208 +0000 UTC m=+1422.049146637" lastFinishedPulling="2025-09-30 14:22:01.522670151 +0000 UTC m=+1425.505758580" observedRunningTime="2025-09-30 14:22:02.698402829 +0000 UTC 
m=+1426.681491258" watchObservedRunningTime="2025-09-30 14:22:02.705033993 +0000 UTC m=+1426.688122422" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.051517 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.093765 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wltxv\" (UniqueName: \"kubernetes.io/projected/18f239b5-877b-4291-8481-6a121c25bff9-kube-api-access-wltxv\") pod \"18f239b5-877b-4291-8481-6a121c25bff9\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.093818 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-combined-ca-bundle\") pod \"18f239b5-877b-4291-8481-6a121c25bff9\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.093974 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-config-data\") pod \"18f239b5-877b-4291-8481-6a121c25bff9\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.094912 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-scripts\") pod \"18f239b5-877b-4291-8481-6a121c25bff9\" (UID: \"18f239b5-877b-4291-8481-6a121c25bff9\") " Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.108093 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-scripts" (OuterVolumeSpecName: "scripts") pod "18f239b5-877b-4291-8481-6a121c25bff9" (UID: 
"18f239b5-877b-4291-8481-6a121c25bff9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.111801 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f239b5-877b-4291-8481-6a121c25bff9-kube-api-access-wltxv" (OuterVolumeSpecName: "kube-api-access-wltxv") pod "18f239b5-877b-4291-8481-6a121c25bff9" (UID: "18f239b5-877b-4291-8481-6a121c25bff9"). InnerVolumeSpecName "kube-api-access-wltxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.127401 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18f239b5-877b-4291-8481-6a121c25bff9" (UID: "18f239b5-877b-4291-8481-6a121c25bff9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.140192 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-config-data" (OuterVolumeSpecName: "config-data") pod "18f239b5-877b-4291-8481-6a121c25bff9" (UID: "18f239b5-877b-4291-8481-6a121c25bff9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.198945 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wltxv\" (UniqueName: \"kubernetes.io/projected/18f239b5-877b-4291-8481-6a121c25bff9-kube-api-access-wltxv\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.199012 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.199031 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.199070 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18f239b5-877b-4291-8481-6a121c25bff9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.683801 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-589w4" event={"ID":"18f239b5-877b-4291-8481-6a121c25bff9","Type":"ContainerDied","Data":"1792e23a7c12b6c5d711f4bceff49e9f9eca9c7865b39256123e60375ed0954f"} Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.683851 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1792e23a7c12b6c5d711f4bceff49e9f9eca9c7865b39256123e60375ed0954f" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.683857 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-589w4" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.771480 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 14:22:03 crc kubenswrapper[4676]: E0930 14:22:03.771946 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f239b5-877b-4291-8481-6a121c25bff9" containerName="nova-cell0-conductor-db-sync" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.771964 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f239b5-877b-4291-8481-6a121c25bff9" containerName="nova-cell0-conductor-db-sync" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.772142 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f239b5-877b-4291-8481-6a121c25bff9" containerName="nova-cell0-conductor-db-sync" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.772737 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.775282 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.782532 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.787396 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bdwsz" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.811544 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2cl\" (UniqueName: \"kubernetes.io/projected/7c38ef92-eb09-4ce7-b23f-10886d83860c-kube-api-access-6w2cl\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc 
kubenswrapper[4676]: I0930 14:22:03.811581 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c38ef92-eb09-4ce7-b23f-10886d83860c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.811659 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c38ef92-eb09-4ce7-b23f-10886d83860c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.912620 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w2cl\" (UniqueName: \"kubernetes.io/projected/7c38ef92-eb09-4ce7-b23f-10886d83860c-kube-api-access-6w2cl\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.913172 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c38ef92-eb09-4ce7-b23f-10886d83860c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.913581 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c38ef92-eb09-4ce7-b23f-10886d83860c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.918045 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c38ef92-eb09-4ce7-b23f-10886d83860c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.927418 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c38ef92-eb09-4ce7-b23f-10886d83860c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:03 crc kubenswrapper[4676]: I0930 14:22:03.929472 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w2cl\" (UniqueName: \"kubernetes.io/projected/7c38ef92-eb09-4ce7-b23f-10886d83860c-kube-api-access-6w2cl\") pod \"nova-cell0-conductor-0\" (UID: \"7c38ef92-eb09-4ce7-b23f-10886d83860c\") " pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:04 crc kubenswrapper[4676]: I0930 14:22:04.090286 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:04 crc kubenswrapper[4676]: I0930 14:22:04.546908 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 14:22:04 crc kubenswrapper[4676]: I0930 14:22:04.692870 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7c38ef92-eb09-4ce7-b23f-10886d83860c","Type":"ContainerStarted","Data":"7ebfec2a34390b2ed51617def3b982b1b8dc987bd0fac358dbac2ba462e29837"} Sep 30 14:22:05 crc kubenswrapper[4676]: I0930 14:22:05.706865 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7c38ef92-eb09-4ce7-b23f-10886d83860c","Type":"ContainerStarted","Data":"27c8d68f7e3cd6375d94bffe0b6ddd74f5eb7bb7fe194de759f058a2be1a2ce1"} Sep 30 14:22:05 crc kubenswrapper[4676]: I0930 14:22:05.707061 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:08 crc kubenswrapper[4676]: I0930 14:22:08.009765 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:08 crc kubenswrapper[4676]: I0930 14:22:08.010343 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="77375b04-44bb-4250-a54c-0c193201f738" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.159:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.120385 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell0-conductor-0" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.139551 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=6.139530741 podStartE2EDuration="6.139530741s" podCreationTimestamp="2025-09-30 14:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:05.741723031 +0000 UTC m=+1429.724811460" watchObservedRunningTime="2025-09-30 14:22:09.139530741 +0000 UTC m=+1433.122619170" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.562381 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jpk85"] Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.564390 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.567643 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.568167 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.584903 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jpk85"] Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.612617 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.612684 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-scripts\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.612757 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-config-data\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.612834 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg5xz\" (UniqueName: \"kubernetes.io/projected/1d2e03c1-e9fc-4b6d-a755-0582ad936263-kube-api-access-gg5xz\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.714725 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg5xz\" (UniqueName: \"kubernetes.io/projected/1d2e03c1-e9fc-4b6d-a755-0582ad936263-kube-api-access-gg5xz\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.714787 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.714840 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-scripts\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.714928 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-config-data\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.721520 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.722479 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.722612 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-config-data\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.727339 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-scripts\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc 
kubenswrapper[4676]: I0930 14:22:09.732623 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.736398 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.742763 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg5xz\" (UniqueName: \"kubernetes.io/projected/1d2e03c1-e9fc-4b6d-a755-0582ad936263-kube-api-access-gg5xz\") pod \"nova-cell0-cell-mapping-jpk85\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.764666 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.869108 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.872703 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.874988 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.882324 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.883532 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.895117 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.895672 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.910977 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.921753 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvg7d\" (UniqueName: \"kubernetes.io/projected/818f60d7-30b2-4ace-a2d0-51f551976dca-kube-api-access-kvg7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.921813 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.923785 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:09 crc kubenswrapper[4676]: I0930 14:22:09.944187 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.025537 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-config-data\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 
14:22:10.025884 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.026034 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.026167 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-config-data\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.026252 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eee199-f211-4092-9600-25fef2b42baf-logs\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.026767 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9wr\" (UniqueName: \"kubernetes.io/projected/c0eee199-f211-4092-9600-25fef2b42baf-kube-api-access-pc9wr\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.026911 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-kvg7d\" (UniqueName: \"kubernetes.io/projected/818f60d7-30b2-4ace-a2d0-51f551976dca-kube-api-access-kvg7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.027036 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.027148 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.027242 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp4ns\" (UniqueName: \"kubernetes.io/projected/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-kube-api-access-jp4ns\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.047766 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.048141 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.062086 4676 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.066691 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.068495 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.082461 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvg7d\" (UniqueName: \"kubernetes.io/projected/818f60d7-30b2-4ace-a2d0-51f551976dca-kube-api-access-kvg7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.117097 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.144465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.144580 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-config-data\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.144607 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eee199-f211-4092-9600-25fef2b42baf-logs\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.170996 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28068cf5-7fa1-4380-be99-4c0951283c8a-logs\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.171140 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9wr\" (UniqueName: \"kubernetes.io/projected/c0eee199-f211-4092-9600-25fef2b42baf-kube-api-access-pc9wr\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.171266 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.171307 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.171358 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-config-data\") pod 
\"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.171387 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp4ns\" (UniqueName: \"kubernetes.io/projected/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-kube-api-access-jp4ns\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.171445 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-config-data\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.171484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8w2r\" (UniqueName: \"kubernetes.io/projected/28068cf5-7fa1-4380-be99-4c0951283c8a-kube-api-access-s8w2r\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.175096 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dc5jg"] Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.182976 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-config-data\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.184656 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.192738 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.204313 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eee199-f211-4092-9600-25fef2b42baf-logs\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.207044 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.208923 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9wr\" (UniqueName: \"kubernetes.io/projected/c0eee199-f211-4092-9600-25fef2b42baf-kube-api-access-pc9wr\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.214262 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-config-data\") pod \"nova-metadata-0\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.223714 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.257831 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dc5jg"] Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.267119 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp4ns\" (UniqueName: \"kubernetes.io/projected/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-kube-api-access-jp4ns\") pod \"nova-scheduler-0\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.273848 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-config-data\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.273945 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfw7\" (UniqueName: \"kubernetes.io/projected/90feda1c-209e-474d-bd1c-eee343b5f674-kube-api-access-5sfw7\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.273978 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8w2r\" (UniqueName: \"kubernetes.io/projected/28068cf5-7fa1-4380-be99-4c0951283c8a-kube-api-access-s8w2r\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.274095 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.274160 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.274242 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28068cf5-7fa1-4380-be99-4c0951283c8a-logs\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.274290 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-config\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.274327 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.274386 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.274422 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.280419 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28068cf5-7fa1-4380-be99-4c0951283c8a-logs\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.286687 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.289701 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-config-data\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.321809 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8w2r\" (UniqueName: \"kubernetes.io/projected/28068cf5-7fa1-4380-be99-4c0951283c8a-kube-api-access-s8w2r\") pod \"nova-api-0\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 
14:22:10.384457 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-config\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.384544 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.384581 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.384670 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfw7\" (UniqueName: \"kubernetes.io/projected/90feda1c-209e-474d-bd1c-eee343b5f674-kube-api-access-5sfw7\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.384703 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.384759 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.385685 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.385898 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.385949 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.386682 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-config\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.386738 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.404404 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfw7\" (UniqueName: \"kubernetes.io/projected/90feda1c-209e-474d-bd1c-eee343b5f674-kube-api-access-5sfw7\") pod \"dnsmasq-dns-845d6d6f59-dc5jg\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.431665 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.497285 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.523350 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.567832 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jpk85"] Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.618634 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.826526 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jpk85" event={"ID":"1d2e03c1-e9fc-4b6d-a755-0582ad936263","Type":"ContainerStarted","Data":"3695bfc2014ea5138bc0f4db58aa049426a17af7a7d5dda0a53ddc9b7ba611e8"} Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.843229 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:10 crc kubenswrapper[4676]: I0930 14:22:10.942847 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.009858 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwwr8"] Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.012892 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.017045 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.017368 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.033062 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwwr8"] Sep 30 14:22:11 crc kubenswrapper[4676]: W0930 14:22:11.033285 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28068cf5_7fa1_4380_be99_4c0951283c8a.slice/crio-c10e65cf28c1ab8d96f6411a387e7fc9adef3094f3833ed748c9431a67b8dee0 WatchSource:0}: Error finding container c10e65cf28c1ab8d96f6411a387e7fc9adef3094f3833ed748c9431a67b8dee0: Status 404 
returned error can't find the container with id c10e65cf28c1ab8d96f6411a387e7fc9adef3094f3833ed748c9431a67b8dee0 Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.116145 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw4z4\" (UniqueName: \"kubernetes.io/projected/279b0c8e-a8bf-457a-a345-ba9c1309c118-kube-api-access-gw4z4\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.116233 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-config-data\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.116488 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-scripts\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.116511 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.150907 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:11 crc kubenswrapper[4676]: W0930 14:22:11.161030 
4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0eee199_f211_4092_9600_25fef2b42baf.slice/crio-93d652576c3cb18aa0cdbd1913a1e61ed1c8668f7667c882e4f46836e8ccd810 WatchSource:0}: Error finding container 93d652576c3cb18aa0cdbd1913a1e61ed1c8668f7667c882e4f46836e8ccd810: Status 404 returned error can't find the container with id 93d652576c3cb18aa0cdbd1913a1e61ed1c8668f7667c882e4f46836e8ccd810 Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.218577 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw4z4\" (UniqueName: \"kubernetes.io/projected/279b0c8e-a8bf-457a-a345-ba9c1309c118-kube-api-access-gw4z4\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.218625 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-config-data\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.218719 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-scripts\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.218736 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: 
\"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.227582 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.228065 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-scripts\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.229655 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-config-data\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.253478 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw4z4\" (UniqueName: \"kubernetes.io/projected/279b0c8e-a8bf-457a-a345-ba9c1309c118-kube-api-access-gw4z4\") pod \"nova-cell1-conductor-db-sync-bwwr8\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.284352 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.364102 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.387737 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dc5jg"] Sep 30 14:22:11 crc kubenswrapper[4676]: W0930 14:22:11.393186 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90feda1c_209e_474d_bd1c_eee343b5f674.slice/crio-c52a4cf7dcdfeba2587705edd1ef20c18e064540f4bbc08f22bba60a9188a7a2 WatchSource:0}: Error finding container c52a4cf7dcdfeba2587705edd1ef20c18e064540f4bbc08f22bba60a9188a7a2: Status 404 returned error can't find the container with id c52a4cf7dcdfeba2587705edd1ef20c18e064540f4bbc08f22bba60a9188a7a2 Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.844699 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28068cf5-7fa1-4380-be99-4c0951283c8a","Type":"ContainerStarted","Data":"c10e65cf28c1ab8d96f6411a387e7fc9adef3094f3833ed748c9431a67b8dee0"} Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.846582 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"818f60d7-30b2-4ace-a2d0-51f551976dca","Type":"ContainerStarted","Data":"d4fd30550352f4bc0cbe97b899e1143d5a6e98070774025f9ae2f2667ceeb39c"} Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.850041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" event={"ID":"90feda1c-209e-474d-bd1c-eee343b5f674","Type":"ContainerDied","Data":"85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02"} Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.850997 4676 generic.go:334] "Generic (PLEG): container finished" podID="90feda1c-209e-474d-bd1c-eee343b5f674" containerID="85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02" exitCode=0 Sep 30 14:22:11 crc kubenswrapper[4676]: 
I0930 14:22:11.851106 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" event={"ID":"90feda1c-209e-474d-bd1c-eee343b5f674","Type":"ContainerStarted","Data":"c52a4cf7dcdfeba2587705edd1ef20c18e064540f4bbc08f22bba60a9188a7a2"} Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.867277 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eee199-f211-4092-9600-25fef2b42baf","Type":"ContainerStarted","Data":"93d652576c3cb18aa0cdbd1913a1e61ed1c8668f7667c882e4f46836e8ccd810"} Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.890137 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jpk85" event={"ID":"1d2e03c1-e9fc-4b6d-a755-0582ad936263","Type":"ContainerStarted","Data":"26d0c3000bad072b538f1bc904894eea3b3bb242351875c7c5b939e6c2114141"} Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.904947 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221","Type":"ContainerStarted","Data":"655f5b7434d20d0d0e2c11b0bea5e7dea41f26699d8203ce6eb06255a22bb5b8"} Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.908763 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwwr8"] Sep 30 14:22:11 crc kubenswrapper[4676]: I0930 14:22:11.922023 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jpk85" podStartSLOduration=2.921999862 podStartE2EDuration="2.921999862s" podCreationTimestamp="2025-09-30 14:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:11.915095022 +0000 UTC m=+1435.898183451" watchObservedRunningTime="2025-09-30 14:22:11.921999862 +0000 UTC m=+1435.905088291" Sep 30 14:22:12 crc kubenswrapper[4676]: I0930 14:22:12.918944 
4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" event={"ID":"279b0c8e-a8bf-457a-a345-ba9c1309c118","Type":"ContainerStarted","Data":"c71d74e0bbf7d6f48c794a196ed4ac131c92a4a7148bf33f1be1e83fd93f7e73"} Sep 30 14:22:12 crc kubenswrapper[4676]: I0930 14:22:12.919277 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" event={"ID":"279b0c8e-a8bf-457a-a345-ba9c1309c118","Type":"ContainerStarted","Data":"5581816deb3f4018449ef45c48c840d7ff63c089915f82f60cce25a453fda3fa"} Sep 30 14:22:12 crc kubenswrapper[4676]: I0930 14:22:12.928042 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" event={"ID":"90feda1c-209e-474d-bd1c-eee343b5f674","Type":"ContainerStarted","Data":"d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18"} Sep 30 14:22:12 crc kubenswrapper[4676]: I0930 14:22:12.928082 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:12 crc kubenswrapper[4676]: I0930 14:22:12.962542 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" podStartSLOduration=2.962522978 podStartE2EDuration="2.962522978s" podCreationTimestamp="2025-09-30 14:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:12.937694099 +0000 UTC m=+1436.920782528" watchObservedRunningTime="2025-09-30 14:22:12.962522978 +0000 UTC m=+1436.945611407" Sep 30 14:22:12 crc kubenswrapper[4676]: I0930 14:22:12.969135 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" podStartSLOduration=2.969116209 podStartE2EDuration="2.969116209s" podCreationTimestamp="2025-09-30 14:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:12.958321678 +0000 UTC m=+1436.941410117" watchObservedRunningTime="2025-09-30 14:22:12.969116209 +0000 UTC m=+1436.952204638" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.453260 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z9nrs"] Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.456417 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.482752 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z9nrs"] Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.491165 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nct9\" (UniqueName: \"kubernetes.io/projected/be110bf6-fee0-4775-a76c-938038512ef6-kube-api-access-2nct9\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.491230 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-utilities\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.491307 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-catalog-content\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " 
pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.593666 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-catalog-content\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.593854 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nct9\" (UniqueName: \"kubernetes.io/projected/be110bf6-fee0-4775-a76c-938038512ef6-kube-api-access-2nct9\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.594033 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-utilities\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.594285 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-catalog-content\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.594643 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-utilities\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " 
pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.621882 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nct9\" (UniqueName: \"kubernetes.io/projected/be110bf6-fee0-4775-a76c-938038512ef6-kube-api-access-2nct9\") pod \"certified-operators-z9nrs\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:13 crc kubenswrapper[4676]: I0930 14:22:13.790350 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:14 crc kubenswrapper[4676]: I0930 14:22:14.129720 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:14 crc kubenswrapper[4676]: I0930 14:22:14.149708 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:15 crc kubenswrapper[4676]: W0930 14:22:15.497109 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe110bf6_fee0_4775_a76c_938038512ef6.slice/crio-d98931d4a192aa0a637c8193c426a0ec484a7a486365741213ea8e7d4cfc7811 WatchSource:0}: Error finding container d98931d4a192aa0a637c8193c426a0ec484a7a486365741213ea8e7d4cfc7811: Status 404 returned error can't find the container with id d98931d4a192aa0a637c8193c426a0ec484a7a486365741213ea8e7d4cfc7811 Sep 30 14:22:15 crc kubenswrapper[4676]: I0930 14:22:15.501983 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z9nrs"] Sep 30 14:22:15 crc kubenswrapper[4676]: I0930 14:22:15.986092 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eee199-f211-4092-9600-25fef2b42baf","Type":"ContainerStarted","Data":"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4"} Sep 30 14:22:15 crc 
kubenswrapper[4676]: I0930 14:22:15.986497 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eee199-f211-4092-9600-25fef2b42baf","Type":"ContainerStarted","Data":"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91"} Sep 30 14:22:15 crc kubenswrapper[4676]: I0930 14:22:15.986669 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-log" containerID="cri-o://0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91" gracePeriod=30 Sep 30 14:22:15 crc kubenswrapper[4676]: I0930 14:22:15.987195 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-metadata" containerID="cri-o://f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4" gracePeriod=30 Sep 30 14:22:15 crc kubenswrapper[4676]: I0930 14:22:15.997644 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221","Type":"ContainerStarted","Data":"08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4"} Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.001041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28068cf5-7fa1-4380-be99-4c0951283c8a","Type":"ContainerStarted","Data":"c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e"} Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.001096 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28068cf5-7fa1-4380-be99-4c0951283c8a","Type":"ContainerStarted","Data":"da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762"} Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.006128 4676 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"818f60d7-30b2-4ace-a2d0-51f551976dca","Type":"ContainerStarted","Data":"be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014"} Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.006477 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="818f60d7-30b2-4ace-a2d0-51f551976dca" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014" gracePeriod=30 Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.008375 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.1271508199999998 podStartE2EDuration="7.008361036s" podCreationTimestamp="2025-09-30 14:22:09 +0000 UTC" firstStartedPulling="2025-09-30 14:22:11.173327889 +0000 UTC m=+1435.156416318" lastFinishedPulling="2025-09-30 14:22:15.054538105 +0000 UTC m=+1439.037626534" observedRunningTime="2025-09-30 14:22:16.006143048 +0000 UTC m=+1439.989231547" watchObservedRunningTime="2025-09-30 14:22:16.008361036 +0000 UTC m=+1439.991449465" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.013499 4676 generic.go:334] "Generic (PLEG): container finished" podID="be110bf6-fee0-4775-a76c-938038512ef6" containerID="ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c" exitCode=0 Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.013546 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9nrs" event={"ID":"be110bf6-fee0-4775-a76c-938038512ef6","Type":"ContainerDied","Data":"ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c"} Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.013570 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9nrs" 
event={"ID":"be110bf6-fee0-4775-a76c-938038512ef6","Type":"ContainerStarted","Data":"d98931d4a192aa0a637c8193c426a0ec484a7a486365741213ea8e7d4cfc7811"} Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.036213 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.142146532 podStartE2EDuration="7.036188123s" podCreationTimestamp="2025-09-30 14:22:09 +0000 UTC" firstStartedPulling="2025-09-30 14:22:11.172887938 +0000 UTC m=+1435.155976367" lastFinishedPulling="2025-09-30 14:22:15.066929529 +0000 UTC m=+1439.050017958" observedRunningTime="2025-09-30 14:22:16.027365643 +0000 UTC m=+1440.010454082" watchObservedRunningTime="2025-09-30 14:22:16.036188123 +0000 UTC m=+1440.019276552" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.051374 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.25918187 podStartE2EDuration="7.0513598s" podCreationTimestamp="2025-09-30 14:22:09 +0000 UTC" firstStartedPulling="2025-09-30 14:22:11.292158793 +0000 UTC m=+1435.275247212" lastFinishedPulling="2025-09-30 14:22:15.084336713 +0000 UTC m=+1439.067425142" observedRunningTime="2025-09-30 14:22:16.049976564 +0000 UTC m=+1440.033064993" watchObservedRunningTime="2025-09-30 14:22:16.0513598 +0000 UTC m=+1440.034448229" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.099288 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.978455887 podStartE2EDuration="7.099261311s" podCreationTimestamp="2025-09-30 14:22:09 +0000 UTC" firstStartedPulling="2025-09-30 14:22:10.879268319 +0000 UTC m=+1434.862356748" lastFinishedPulling="2025-09-30 14:22:15.000073733 +0000 UTC m=+1438.983162172" observedRunningTime="2025-09-30 14:22:16.097452694 +0000 UTC m=+1440.080541133" watchObservedRunningTime="2025-09-30 14:22:16.099261311 +0000 UTC m=+1440.082349740" Sep 30 
14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.655195 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.762200 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-combined-ca-bundle\") pod \"c0eee199-f211-4092-9600-25fef2b42baf\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.762344 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-config-data\") pod \"c0eee199-f211-4092-9600-25fef2b42baf\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.763620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eee199-f211-4092-9600-25fef2b42baf-logs\") pod \"c0eee199-f211-4092-9600-25fef2b42baf\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.763663 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc9wr\" (UniqueName: \"kubernetes.io/projected/c0eee199-f211-4092-9600-25fef2b42baf-kube-api-access-pc9wr\") pod \"c0eee199-f211-4092-9600-25fef2b42baf\" (UID: \"c0eee199-f211-4092-9600-25fef2b42baf\") " Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.764119 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0eee199-f211-4092-9600-25fef2b42baf-logs" (OuterVolumeSpecName: "logs") pod "c0eee199-f211-4092-9600-25fef2b42baf" (UID: "c0eee199-f211-4092-9600-25fef2b42baf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.765005 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eee199-f211-4092-9600-25fef2b42baf-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.770113 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0eee199-f211-4092-9600-25fef2b42baf-kube-api-access-pc9wr" (OuterVolumeSpecName: "kube-api-access-pc9wr") pod "c0eee199-f211-4092-9600-25fef2b42baf" (UID: "c0eee199-f211-4092-9600-25fef2b42baf"). InnerVolumeSpecName "kube-api-access-pc9wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.794739 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-config-data" (OuterVolumeSpecName: "config-data") pod "c0eee199-f211-4092-9600-25fef2b42baf" (UID: "c0eee199-f211-4092-9600-25fef2b42baf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.802091 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0eee199-f211-4092-9600-25fef2b42baf" (UID: "c0eee199-f211-4092-9600-25fef2b42baf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.868062 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.868356 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc9wr\" (UniqueName: \"kubernetes.io/projected/c0eee199-f211-4092-9600-25fef2b42baf-kube-api-access-pc9wr\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:16 crc kubenswrapper[4676]: I0930 14:22:16.868371 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eee199-f211-4092-9600-25fef2b42baf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.025521 4676 generic.go:334] "Generic (PLEG): container finished" podID="c0eee199-f211-4092-9600-25fef2b42baf" containerID="f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4" exitCode=0 Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.025549 4676 generic.go:334] "Generic (PLEG): container finished" podID="c0eee199-f211-4092-9600-25fef2b42baf" containerID="0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91" exitCode=143 Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.025580 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.025652 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eee199-f211-4092-9600-25fef2b42baf","Type":"ContainerDied","Data":"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4"} Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.025746 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eee199-f211-4092-9600-25fef2b42baf","Type":"ContainerDied","Data":"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91"} Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.025765 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eee199-f211-4092-9600-25fef2b42baf","Type":"ContainerDied","Data":"93d652576c3cb18aa0cdbd1913a1e61ed1c8668f7667c882e4f46836e8ccd810"} Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.026671 4676 scope.go:117] "RemoveContainer" containerID="f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.029063 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9nrs" event={"ID":"be110bf6-fee0-4775-a76c-938038512ef6","Type":"ContainerStarted","Data":"a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a"} Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.053851 4676 scope.go:117] "RemoveContainer" containerID="0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.073622 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.082357 4676 scope.go:117] "RemoveContainer" containerID="f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4" Sep 30 14:22:17 crc 
kubenswrapper[4676]: E0930 14:22:17.083065 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4\": container with ID starting with f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4 not found: ID does not exist" containerID="f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.083152 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4"} err="failed to get container status \"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4\": rpc error: code = NotFound desc = could not find container \"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4\": container with ID starting with f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4 not found: ID does not exist" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.083201 4676 scope.go:117] "RemoveContainer" containerID="0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91" Sep 30 14:22:17 crc kubenswrapper[4676]: E0930 14:22:17.083834 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91\": container with ID starting with 0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91 not found: ID does not exist" containerID="0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.083914 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91"} err="failed to get container status 
\"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91\": rpc error: code = NotFound desc = could not find container \"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91\": container with ID starting with 0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91 not found: ID does not exist" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.083941 4676 scope.go:117] "RemoveContainer" containerID="f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.084424 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4"} err="failed to get container status \"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4\": rpc error: code = NotFound desc = could not find container \"f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4\": container with ID starting with f217f1307b5fb7a42b33bf9c1d2775be94b89adaf13b609cf4c268e3b4708ed4 not found: ID does not exist" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.084474 4676 scope.go:117] "RemoveContainer" containerID="0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.084696 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.084768 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91"} err="failed to get container status \"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91\": rpc error: code = NotFound desc = could not find container \"0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91\": container with ID starting with 0f4807b52c1a96059749cdcbe06bef070962521a1c2c849e41bc17c39067ea91 not 
found: ID does not exist" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.112606 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:17 crc kubenswrapper[4676]: E0930 14:22:17.113195 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-log" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.113212 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-log" Sep 30 14:22:17 crc kubenswrapper[4676]: E0930 14:22:17.113267 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-metadata" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.113275 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-metadata" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.113501 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-log" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.113521 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eee199-f211-4092-9600-25fef2b42baf" containerName="nova-metadata-metadata" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.114782 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.117673 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.117749 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.129818 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.176856 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-logs\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.176969 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxxn\" (UniqueName: \"kubernetes.io/projected/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-kube-api-access-bxxxn\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.177053 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.177226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-config-data\") pod \"nova-metadata-0\" 
(UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.177296 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.279077 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-config-data\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.279178 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.279810 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-logs\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.279864 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxxn\" (UniqueName: \"kubernetes.io/projected/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-kube-api-access-bxxxn\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.279949 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.280396 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-logs\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.290018 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.292686 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-config-data\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.296190 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.307469 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxxn\" (UniqueName: \"kubernetes.io/projected/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-kube-api-access-bxxxn\") pod 
\"nova-metadata-0\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.439061 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.453618 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0eee199-f211-4092-9600-25fef2b42baf" path="/var/lib/kubelet/pods/c0eee199-f211-4092-9600-25fef2b42baf/volumes" Sep 30 14:22:17 crc kubenswrapper[4676]: I0930 14:22:17.888952 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:17 crc kubenswrapper[4676]: W0930 14:22:17.891635 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e28bd01_5bf5_45e9_9b7d_e47c0828c533.slice/crio-8cee40c262ada7a0db4fdfecc847ff3788420d10dc208335602ee44bcab436ab WatchSource:0}: Error finding container 8cee40c262ada7a0db4fdfecc847ff3788420d10dc208335602ee44bcab436ab: Status 404 returned error can't find the container with id 8cee40c262ada7a0db4fdfecc847ff3788420d10dc208335602ee44bcab436ab Sep 30 14:22:18 crc kubenswrapper[4676]: I0930 14:22:18.038797 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e28bd01-5bf5-45e9-9b7d-e47c0828c533","Type":"ContainerStarted","Data":"8cee40c262ada7a0db4fdfecc847ff3788420d10dc208335602ee44bcab436ab"} Sep 30 14:22:18 crc kubenswrapper[4676]: I0930 14:22:18.042909 4676 generic.go:334] "Generic (PLEG): container finished" podID="be110bf6-fee0-4775-a76c-938038512ef6" containerID="a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a" exitCode=0 Sep 30 14:22:18 crc kubenswrapper[4676]: I0930 14:22:18.042941 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9nrs" 
event={"ID":"be110bf6-fee0-4775-a76c-938038512ef6","Type":"ContainerDied","Data":"a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a"} Sep 30 14:22:19 crc kubenswrapper[4676]: I0930 14:22:19.055328 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9nrs" event={"ID":"be110bf6-fee0-4775-a76c-938038512ef6","Type":"ContainerStarted","Data":"422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275"} Sep 30 14:22:19 crc kubenswrapper[4676]: I0930 14:22:19.057128 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e28bd01-5bf5-45e9-9b7d-e47c0828c533","Type":"ContainerStarted","Data":"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32"} Sep 30 14:22:19 crc kubenswrapper[4676]: I0930 14:22:19.057176 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e28bd01-5bf5-45e9-9b7d-e47c0828c533","Type":"ContainerStarted","Data":"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57"} Sep 30 14:22:19 crc kubenswrapper[4676]: I0930 14:22:19.080339 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z9nrs" podStartSLOduration=3.617362883 podStartE2EDuration="6.080321658s" podCreationTimestamp="2025-09-30 14:22:13 +0000 UTC" firstStartedPulling="2025-09-30 14:22:16.01498579 +0000 UTC m=+1439.998074219" lastFinishedPulling="2025-09-30 14:22:18.477944565 +0000 UTC m=+1442.461032994" observedRunningTime="2025-09-30 14:22:19.074451434 +0000 UTC m=+1443.057539863" watchObservedRunningTime="2025-09-30 14:22:19.080321658 +0000 UTC m=+1443.063410087" Sep 30 14:22:19 crc kubenswrapper[4676]: I0930 14:22:19.109908 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.109865869 podStartE2EDuration="2.109865869s" podCreationTimestamp="2025-09-30 14:22:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:19.098340448 +0000 UTC m=+1443.081428887" watchObservedRunningTime="2025-09-30 14:22:19.109865869 +0000 UTC m=+1443.092954298" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.189011 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.433459 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.433495 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.524841 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.524900 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.554025 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.620814 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.690932 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bwmdv"] Sep 30 14:22:20 crc kubenswrapper[4676]: I0930 14:22:20.691183 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" podUID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerName="dnsmasq-dns" containerID="cri-o://f48b8383470f3a455212ee5d9c3258a9e675b448fe0402af5be5813bca42ec5e" gracePeriod=10 Sep 30 
14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.090073 4676 generic.go:334] "Generic (PLEG): container finished" podID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerID="f48b8383470f3a455212ee5d9c3258a9e675b448fe0402af5be5813bca42ec5e" exitCode=0 Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.090321 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" event={"ID":"5bccc396-3182-4955-8d75-93c5b0b221c6","Type":"ContainerDied","Data":"f48b8383470f3a455212ee5d9c3258a9e675b448fe0402af5be5813bca42ec5e"} Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.092421 4676 generic.go:334] "Generic (PLEG): container finished" podID="1d2e03c1-e9fc-4b6d-a755-0582ad936263" containerID="26d0c3000bad072b538f1bc904894eea3b3bb242351875c7c5b939e6c2114141" exitCode=0 Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.092498 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jpk85" event={"ID":"1d2e03c1-e9fc-4b6d-a755-0582ad936263","Type":"ContainerDied","Data":"26d0c3000bad072b538f1bc904894eea3b3bb242351875c7c5b939e6c2114141"} Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.144614 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.292918 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.398638 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-swift-storage-0\") pod \"5bccc396-3182-4955-8d75-93c5b0b221c6\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.398790 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-config\") pod \"5bccc396-3182-4955-8d75-93c5b0b221c6\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.398862 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-nb\") pod \"5bccc396-3182-4955-8d75-93c5b0b221c6\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.398911 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9vj\" (UniqueName: \"kubernetes.io/projected/5bccc396-3182-4955-8d75-93c5b0b221c6-kube-api-access-nh9vj\") pod \"5bccc396-3182-4955-8d75-93c5b0b221c6\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.399052 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-svc\") pod \"5bccc396-3182-4955-8d75-93c5b0b221c6\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.399096 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-sb\") pod \"5bccc396-3182-4955-8d75-93c5b0b221c6\" (UID: \"5bccc396-3182-4955-8d75-93c5b0b221c6\") " Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.406360 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bccc396-3182-4955-8d75-93c5b0b221c6-kube-api-access-nh9vj" (OuterVolumeSpecName: "kube-api-access-nh9vj") pod "5bccc396-3182-4955-8d75-93c5b0b221c6" (UID: "5bccc396-3182-4955-8d75-93c5b0b221c6"). InnerVolumeSpecName "kube-api-access-nh9vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.467571 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bccc396-3182-4955-8d75-93c5b0b221c6" (UID: "5bccc396-3182-4955-8d75-93c5b0b221c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.488639 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bccc396-3182-4955-8d75-93c5b0b221c6" (UID: "5bccc396-3182-4955-8d75-93c5b0b221c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.490305 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-config" (OuterVolumeSpecName: "config") pod "5bccc396-3182-4955-8d75-93c5b0b221c6" (UID: "5bccc396-3182-4955-8d75-93c5b0b221c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.490813 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bccc396-3182-4955-8d75-93c5b0b221c6" (UID: "5bccc396-3182-4955-8d75-93c5b0b221c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.502348 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.503537 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.503648 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.503736 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.504374 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9vj\" (UniqueName: \"kubernetes.io/projected/5bccc396-3182-4955-8d75-93c5b0b221c6-kube-api-access-nh9vj\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.515487 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.515852 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.517391 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5bccc396-3182-4955-8d75-93c5b0b221c6" (UID: "5bccc396-3182-4955-8d75-93c5b0b221c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:21 crc kubenswrapper[4676]: I0930 14:22:21.606594 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5bccc396-3182-4955-8d75-93c5b0b221c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.122766 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" event={"ID":"5bccc396-3182-4955-8d75-93c5b0b221c6","Type":"ContainerDied","Data":"87d6b23ede78bd8fae1eb6c886955a685b894bb2f23c7ba61cea814508eebdea"} Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.122839 4676 scope.go:117] "RemoveContainer" containerID="f48b8383470f3a455212ee5d9c3258a9e675b448fe0402af5be5813bca42ec5e" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.122971 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-bwmdv" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.166293 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bwmdv"] Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.173657 4676 scope.go:117] "RemoveContainer" containerID="c2cb59ad197d6250a746c1d49fc9115f9f36368e6ecfaa85e386183dd88b5d7e" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.175365 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-bwmdv"] Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.439157 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.439217 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.505489 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.625375 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-combined-ca-bundle\") pod \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.625473 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-scripts\") pod \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.625502 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg5xz\" (UniqueName: \"kubernetes.io/projected/1d2e03c1-e9fc-4b6d-a755-0582ad936263-kube-api-access-gg5xz\") pod \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.625557 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-config-data\") pod \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\" (UID: \"1d2e03c1-e9fc-4b6d-a755-0582ad936263\") " Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.631678 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-scripts" (OuterVolumeSpecName: "scripts") pod "1d2e03c1-e9fc-4b6d-a755-0582ad936263" (UID: "1d2e03c1-e9fc-4b6d-a755-0582ad936263"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.636641 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d2e03c1-e9fc-4b6d-a755-0582ad936263-kube-api-access-gg5xz" (OuterVolumeSpecName: "kube-api-access-gg5xz") pod "1d2e03c1-e9fc-4b6d-a755-0582ad936263" (UID: "1d2e03c1-e9fc-4b6d-a755-0582ad936263"). InnerVolumeSpecName "kube-api-access-gg5xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.657387 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-config-data" (OuterVolumeSpecName: "config-data") pod "1d2e03c1-e9fc-4b6d-a755-0582ad936263" (UID: "1d2e03c1-e9fc-4b6d-a755-0582ad936263"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.663989 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d2e03c1-e9fc-4b6d-a755-0582ad936263" (UID: "1d2e03c1-e9fc-4b6d-a755-0582ad936263"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.728032 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.728078 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.728093 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d2e03c1-e9fc-4b6d-a755-0582ad936263-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:22 crc kubenswrapper[4676]: I0930 14:22:22.728105 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg5xz\" (UniqueName: \"kubernetes.io/projected/1d2e03c1-e9fc-4b6d-a755-0582ad936263-kube-api-access-gg5xz\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.147372 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jpk85" event={"ID":"1d2e03c1-e9fc-4b6d-a755-0582ad936263","Type":"ContainerDied","Data":"3695bfc2014ea5138bc0f4db58aa049426a17af7a7d5dda0a53ddc9b7ba611e8"} Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.147777 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3695bfc2014ea5138bc0f4db58aa049426a17af7a7d5dda0a53ddc9b7ba611e8" Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.147649 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jpk85" Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.302624 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.302863 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-log" containerID="cri-o://da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762" gracePeriod=30 Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.303295 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-api" containerID="cri-o://c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e" gracePeriod=30 Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.323832 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.324070 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" containerName="nova-scheduler-scheduler" containerID="cri-o://08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" gracePeriod=30 Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.364623 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.365463 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerName="nova-metadata-metadata" containerID="cri-o://278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32" gracePeriod=30 Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.365823 4676 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerName="nova-metadata-log" containerID="cri-o://bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57" gracePeriod=30 Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.450659 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bccc396-3182-4955-8d75-93c5b0b221c6" path="/var/lib/kubelet/pods/5bccc396-3182-4955-8d75-93c5b0b221c6/volumes" Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.791606 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:23 crc kubenswrapper[4676]: I0930 14:22:23.791656 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.123724 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.160921 4676 generic.go:334] "Generic (PLEG): container finished" podID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerID="da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762" exitCode=143 Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.161042 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28068cf5-7fa1-4380-be99-4c0951283c8a","Type":"ContainerDied","Data":"da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762"} Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.163095 4676 generic.go:334] "Generic (PLEG): container finished" podID="279b0c8e-a8bf-457a-a345-ba9c1309c118" containerID="c71d74e0bbf7d6f48c794a196ed4ac131c92a4a7148bf33f1be1e83fd93f7e73" exitCode=0 Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.163148 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" event={"ID":"279b0c8e-a8bf-457a-a345-ba9c1309c118","Type":"ContainerDied","Data":"c71d74e0bbf7d6f48c794a196ed4ac131c92a4a7148bf33f1be1e83fd93f7e73"} Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.166119 4676 generic.go:334] "Generic (PLEG): container finished" podID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerID="278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32" exitCode=0 Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.166159 4676 generic.go:334] "Generic (PLEG): container finished" podID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerID="bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57" exitCode=143 Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.166188 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e28bd01-5bf5-45e9-9b7d-e47c0828c533","Type":"ContainerDied","Data":"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32"} Sep 
30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.166217 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e28bd01-5bf5-45e9-9b7d-e47c0828c533","Type":"ContainerDied","Data":"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57"} Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.166229 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4e28bd01-5bf5-45e9-9b7d-e47c0828c533","Type":"ContainerDied","Data":"8cee40c262ada7a0db4fdfecc847ff3788420d10dc208335602ee44bcab436ab"} Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.166248 4676 scope.go:117] "RemoveContainer" containerID="278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.166369 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.191772 4676 scope.go:117] "RemoveContainer" containerID="bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.208857 4676 scope.go:117] "RemoveContainer" containerID="278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32" Sep 30 14:22:24 crc kubenswrapper[4676]: E0930 14:22:24.209282 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32\": container with ID starting with 278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32 not found: ID does not exist" containerID="278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.209314 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32"} err="failed to get container status \"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32\": rpc error: code = NotFound desc = could not find container \"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32\": container with ID starting with 278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32 not found: ID does not exist" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.209334 4676 scope.go:117] "RemoveContainer" containerID="bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57" Sep 30 14:22:24 crc kubenswrapper[4676]: E0930 14:22:24.210831 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57\": container with ID starting with bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57 not found: ID does not exist" containerID="bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.210885 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57"} err="failed to get container status \"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57\": rpc error: code = NotFound desc = could not find container \"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57\": container with ID starting with bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57 not found: ID does not exist" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.210913 4676 scope.go:117] "RemoveContainer" containerID="278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.211198 4676 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32"} err="failed to get container status \"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32\": rpc error: code = NotFound desc = could not find container \"278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32\": container with ID starting with 278fb22e25d5db98e26196f0f6f87029e00fc7a04d7fed14dd6c59f7546eee32 not found: ID does not exist" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.211217 4676 scope.go:117] "RemoveContainer" containerID="bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.211564 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57"} err="failed to get container status \"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57\": rpc error: code = NotFound desc = could not find container \"bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57\": container with ID starting with bb5c973481bbdd56894ab22a3dd28fb3fb887ed6e41a7dc9282c7cf9afa50b57 not found: ID does not exist" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.261061 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-config-data\") pod \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.261272 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-nova-metadata-tls-certs\") pod \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " Sep 30 14:22:24 crc 
kubenswrapper[4676]: I0930 14:22:24.261342 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-logs\") pod \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.261582 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxxxn\" (UniqueName: \"kubernetes.io/projected/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-kube-api-access-bxxxn\") pod \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.261616 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-combined-ca-bundle\") pod \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\" (UID: \"4e28bd01-5bf5-45e9-9b7d-e47c0828c533\") " Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.262186 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-logs" (OuterVolumeSpecName: "logs") pod "4e28bd01-5bf5-45e9-9b7d-e47c0828c533" (UID: "4e28bd01-5bf5-45e9-9b7d-e47c0828c533"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.262368 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.288155 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-kube-api-access-bxxxn" (OuterVolumeSpecName: "kube-api-access-bxxxn") pod "4e28bd01-5bf5-45e9-9b7d-e47c0828c533" (UID: "4e28bd01-5bf5-45e9-9b7d-e47c0828c533"). InnerVolumeSpecName "kube-api-access-bxxxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.297949 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-config-data" (OuterVolumeSpecName: "config-data") pod "4e28bd01-5bf5-45e9-9b7d-e47c0828c533" (UID: "4e28bd01-5bf5-45e9-9b7d-e47c0828c533"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.300563 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e28bd01-5bf5-45e9-9b7d-e47c0828c533" (UID: "4e28bd01-5bf5-45e9-9b7d-e47c0828c533"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.335128 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4e28bd01-5bf5-45e9-9b7d-e47c0828c533" (UID: "4e28bd01-5bf5-45e9-9b7d-e47c0828c533"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.364707 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxxxn\" (UniqueName: \"kubernetes.io/projected/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-kube-api-access-bxxxn\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.364758 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.364769 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.364781 4676 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e28bd01-5bf5-45e9-9b7d-e47c0828c533-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.521636 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.528983 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.539085 4676 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:24 crc kubenswrapper[4676]: E0930 14:22:24.540059 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerName="nova-metadata-metadata" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540098 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerName="nova-metadata-metadata" Sep 30 14:22:24 crc kubenswrapper[4676]: E0930 14:22:24.540131 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerName="init" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540144 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerName="init" Sep 30 14:22:24 crc kubenswrapper[4676]: E0930 14:22:24.540178 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerName="dnsmasq-dns" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540192 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerName="dnsmasq-dns" Sep 30 14:22:24 crc kubenswrapper[4676]: E0930 14:22:24.540231 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d2e03c1-e9fc-4b6d-a755-0582ad936263" containerName="nova-manage" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540245 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d2e03c1-e9fc-4b6d-a755-0582ad936263" containerName="nova-manage" Sep 30 14:22:24 crc kubenswrapper[4676]: E0930 14:22:24.540274 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerName="nova-metadata-log" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540285 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" 
containerName="nova-metadata-log" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540622 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerName="nova-metadata-log" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540659 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d2e03c1-e9fc-4b6d-a755-0582ad936263" containerName="nova-manage" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540695 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" containerName="nova-metadata-metadata" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.540729 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bccc396-3182-4955-8d75-93c5b0b221c6" containerName="dnsmasq-dns" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.542667 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.547951 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.548367 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.549365 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.672160 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jhn\" (UniqueName: \"kubernetes.io/projected/36d8ac97-2979-408c-add8-bbda7b01abe2-kube-api-access-k8jhn\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.672561 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-config-data\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.672743 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.673000 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d8ac97-2979-408c-add8-bbda7b01abe2-logs\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.673193 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.776334 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-config-data\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.776974 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.777072 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d8ac97-2979-408c-add8-bbda7b01abe2-logs\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.777123 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.777287 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jhn\" (UniqueName: \"kubernetes.io/projected/36d8ac97-2979-408c-add8-bbda7b01abe2-kube-api-access-k8jhn\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.777717 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d8ac97-2979-408c-add8-bbda7b01abe2-logs\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.782677 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 
14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.783471 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-config-data\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.784137 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.796212 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jhn\" (UniqueName: \"kubernetes.io/projected/36d8ac97-2979-408c-add8-bbda7b01abe2-kube-api-access-k8jhn\") pod \"nova-metadata-0\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") " pod="openstack/nova-metadata-0" Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.843170 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-z9nrs" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="registry-server" probeResult="failure" output=< Sep 30 14:22:24 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Sep 30 14:22:24 crc kubenswrapper[4676]: > Sep 30 14:22:24 crc kubenswrapper[4676]: I0930 14:22:24.875520 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.349371 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.445421 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e28bd01-5bf5-45e9-9b7d-e47c0828c533" path="/var/lib/kubelet/pods/4e28bd01-5bf5-45e9-9b7d-e47c0828c533/volumes" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.510728 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:25 crc kubenswrapper[4676]: E0930 14:22:25.527461 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 14:22:25 crc kubenswrapper[4676]: E0930 14:22:25.530186 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 14:22:25 crc kubenswrapper[4676]: E0930 14:22:25.531597 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 14:22:25 crc kubenswrapper[4676]: E0930 14:22:25.531662 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" containerName="nova-scheduler-scheduler" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.595430 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-config-data\") pod \"279b0c8e-a8bf-457a-a345-ba9c1309c118\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.595761 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw4z4\" (UniqueName: \"kubernetes.io/projected/279b0c8e-a8bf-457a-a345-ba9c1309c118-kube-api-access-gw4z4\") pod \"279b0c8e-a8bf-457a-a345-ba9c1309c118\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.595878 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-scripts\") pod \"279b0c8e-a8bf-457a-a345-ba9c1309c118\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.596009 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-combined-ca-bundle\") pod \"279b0c8e-a8bf-457a-a345-ba9c1309c118\" (UID: \"279b0c8e-a8bf-457a-a345-ba9c1309c118\") " Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.601288 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-scripts" (OuterVolumeSpecName: "scripts") pod "279b0c8e-a8bf-457a-a345-ba9c1309c118" (UID: "279b0c8e-a8bf-457a-a345-ba9c1309c118"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.602919 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279b0c8e-a8bf-457a-a345-ba9c1309c118-kube-api-access-gw4z4" (OuterVolumeSpecName: "kube-api-access-gw4z4") pod "279b0c8e-a8bf-457a-a345-ba9c1309c118" (UID: "279b0c8e-a8bf-457a-a345-ba9c1309c118"). InnerVolumeSpecName "kube-api-access-gw4z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.624140 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-config-data" (OuterVolumeSpecName: "config-data") pod "279b0c8e-a8bf-457a-a345-ba9c1309c118" (UID: "279b0c8e-a8bf-457a-a345-ba9c1309c118"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.626315 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "279b0c8e-a8bf-457a-a345-ba9c1309c118" (UID: "279b0c8e-a8bf-457a-a345-ba9c1309c118"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.698038 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.698069 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.698079 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279b0c8e-a8bf-457a-a345-ba9c1309c118-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:25 crc kubenswrapper[4676]: I0930 14:22:25.698089 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw4z4\" (UniqueName: \"kubernetes.io/projected/279b0c8e-a8bf-457a-a345-ba9c1309c118-kube-api-access-gw4z4\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.185158 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" event={"ID":"279b0c8e-a8bf-457a-a345-ba9c1309c118","Type":"ContainerDied","Data":"5581816deb3f4018449ef45c48c840d7ff63c089915f82f60cce25a453fda3fa"} Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.185509 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5581816deb3f4018449ef45c48c840d7ff63c089915f82f60cce25a453fda3fa" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.185185 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bwwr8" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.187184 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36d8ac97-2979-408c-add8-bbda7b01abe2","Type":"ContainerStarted","Data":"42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2"} Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.187209 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36d8ac97-2979-408c-add8-bbda7b01abe2","Type":"ContainerStarted","Data":"a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6"} Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.187219 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36d8ac97-2979-408c-add8-bbda7b01abe2","Type":"ContainerStarted","Data":"a2d859741ac220cb8e732ae37d5c62693400855ad8279fe9234915301d2535d9"} Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.260806 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.260785401 podStartE2EDuration="2.260785401s" podCreationTimestamp="2025-09-30 14:22:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:26.213074315 +0000 UTC m=+1450.196162744" watchObservedRunningTime="2025-09-30 14:22:26.260785401 +0000 UTC m=+1450.243873830" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.341961 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 14:22:26 crc kubenswrapper[4676]: E0930 14:22:26.342966 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279b0c8e-a8bf-457a-a345-ba9c1309c118" containerName="nova-cell1-conductor-db-sync" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.342989 4676 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="279b0c8e-a8bf-457a-a345-ba9c1309c118" containerName="nova-cell1-conductor-db-sync" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.344031 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="279b0c8e-a8bf-457a-a345-ba9c1309c118" containerName="nova-cell1-conductor-db-sync" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.353329 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.369035 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.411424 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.435750 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3250c50-5426-440d-a8ba-9a4f75001b16-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.435801 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3250c50-5426-440d-a8ba-9a4f75001b16-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.435879 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk7mp\" (UniqueName: \"kubernetes.io/projected/f3250c50-5426-440d-a8ba-9a4f75001b16-kube-api-access-rk7mp\") pod \"nova-cell1-conductor-0\" (UID: 
\"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.537988 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3250c50-5426-440d-a8ba-9a4f75001b16-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.538041 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3250c50-5426-440d-a8ba-9a4f75001b16-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.538155 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk7mp\" (UniqueName: \"kubernetes.io/projected/f3250c50-5426-440d-a8ba-9a4f75001b16-kube-api-access-rk7mp\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.545494 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3250c50-5426-440d-a8ba-9a4f75001b16-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.545510 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3250c50-5426-440d-a8ba-9a4f75001b16-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: 
I0930 14:22:26.555111 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk7mp\" (UniqueName: \"kubernetes.io/projected/f3250c50-5426-440d-a8ba-9a4f75001b16-kube-api-access-rk7mp\") pod \"nova-cell1-conductor-0\" (UID: \"f3250c50-5426-440d-a8ba-9a4f75001b16\") " pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:26 crc kubenswrapper[4676]: I0930 14:22:26.716149 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.187512 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.196825 4676 generic.go:334] "Generic (PLEG): container finished" podID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerID="c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e" exitCode=0 Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.196867 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.196942 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28068cf5-7fa1-4380-be99-4c0951283c8a","Type":"ContainerDied","Data":"c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e"} Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.196972 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"28068cf5-7fa1-4380-be99-4c0951283c8a","Type":"ContainerDied","Data":"c10e65cf28c1ab8d96f6411a387e7fc9adef3094f3833ed748c9431a67b8dee0"} Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.196992 4676 scope.go:117] "RemoveContainer" containerID="c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.227124 4676 scope.go:117] "RemoveContainer" containerID="da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.258688 4676 scope.go:117] "RemoveContainer" containerID="c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e" Sep 30 14:22:27 crc kubenswrapper[4676]: E0930 14:22:27.259363 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e\": container with ID starting with c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e not found: ID does not exist" containerID="c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.259424 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e"} err="failed to get container status \"c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e\": rpc error: code = 
NotFound desc = could not find container \"c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e\": container with ID starting with c398912b19fcae65168856e20bfce851fe8904dce4ccbeede142a806f649703e not found: ID does not exist" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.259451 4676 scope.go:117] "RemoveContainer" containerID="da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762" Sep 30 14:22:27 crc kubenswrapper[4676]: E0930 14:22:27.261806 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762\": container with ID starting with da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762 not found: ID does not exist" containerID="da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.261842 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762"} err="failed to get container status \"da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762\": rpc error: code = NotFound desc = could not find container \"da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762\": container with ID starting with da4fc51b2a61e554d610da7848635d7d2345c3e8056a1910424b8488a66bb762 not found: ID does not exist" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.283863 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.351782 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28068cf5-7fa1-4380-be99-4c0951283c8a-logs\") pod \"28068cf5-7fa1-4380-be99-4c0951283c8a\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " Sep 30 14:22:27 crc 
kubenswrapper[4676]: I0930 14:22:27.351891 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-combined-ca-bundle\") pod \"28068cf5-7fa1-4380-be99-4c0951283c8a\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.351965 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-config-data\") pod \"28068cf5-7fa1-4380-be99-4c0951283c8a\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.351989 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8w2r\" (UniqueName: \"kubernetes.io/projected/28068cf5-7fa1-4380-be99-4c0951283c8a-kube-api-access-s8w2r\") pod \"28068cf5-7fa1-4380-be99-4c0951283c8a\" (UID: \"28068cf5-7fa1-4380-be99-4c0951283c8a\") " Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.352599 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28068cf5-7fa1-4380-be99-4c0951283c8a-logs" (OuterVolumeSpecName: "logs") pod "28068cf5-7fa1-4380-be99-4c0951283c8a" (UID: "28068cf5-7fa1-4380-be99-4c0951283c8a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.352715 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28068cf5-7fa1-4380-be99-4c0951283c8a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.356305 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28068cf5-7fa1-4380-be99-4c0951283c8a-kube-api-access-s8w2r" (OuterVolumeSpecName: "kube-api-access-s8w2r") pod "28068cf5-7fa1-4380-be99-4c0951283c8a" (UID: "28068cf5-7fa1-4380-be99-4c0951283c8a"). InnerVolumeSpecName "kube-api-access-s8w2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.380216 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28068cf5-7fa1-4380-be99-4c0951283c8a" (UID: "28068cf5-7fa1-4380-be99-4c0951283c8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.383077 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-config-data" (OuterVolumeSpecName: "config-data") pod "28068cf5-7fa1-4380-be99-4c0951283c8a" (UID: "28068cf5-7fa1-4380-be99-4c0951283c8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.454502 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.454534 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28068cf5-7fa1-4380-be99-4c0951283c8a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.454542 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8w2r\" (UniqueName: \"kubernetes.io/projected/28068cf5-7fa1-4380-be99-4c0951283c8a-kube-api-access-s8w2r\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.524849 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.541368 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.559900 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:27 crc kubenswrapper[4676]: E0930 14:22:27.560418 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-api" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.560439 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-api" Sep 30 14:22:27 crc kubenswrapper[4676]: E0930 14:22:27.560471 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-log" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.560479 4676 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-log" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.560702 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-api" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.560737 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" containerName="nova-api-log" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.561973 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.565976 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.573022 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.595996 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.659601 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-config-data\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.659675 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qp2t\" (UniqueName: \"kubernetes.io/projected/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-kube-api-access-7qp2t\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.659830 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-logs\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.659847 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.737812 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.761971 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-logs\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.762014 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.762045 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-config-data\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.762115 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7qp2t\" (UniqueName: \"kubernetes.io/projected/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-kube-api-access-7qp2t\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.763700 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-logs\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.766506 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.768127 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-config-data\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.782395 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qp2t\" (UniqueName: \"kubernetes.io/projected/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-kube-api-access-7qp2t\") pod \"nova-api-0\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.863332 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp4ns\" (UniqueName: \"kubernetes.io/projected/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-kube-api-access-jp4ns\") pod \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " Sep 30 14:22:27 crc 
kubenswrapper[4676]: I0930 14:22:27.863384 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-combined-ca-bundle\") pod \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.863450 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-config-data\") pod \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\" (UID: \"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221\") " Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.866769 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-kube-api-access-jp4ns" (OuterVolumeSpecName: "kube-api-access-jp4ns") pod "29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" (UID: "29ef64ab-bbf7-4ab9-ba3d-938dd6d90221"). InnerVolumeSpecName "kube-api-access-jp4ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.885134 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.894447 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" (UID: "29ef64ab-bbf7-4ab9-ba3d-938dd6d90221"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.896978 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-config-data" (OuterVolumeSpecName: "config-data") pod "29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" (UID: "29ef64ab-bbf7-4ab9-ba3d-938dd6d90221"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.965181 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp4ns\" (UniqueName: \"kubernetes.io/projected/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-kube-api-access-jp4ns\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.965218 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:27 crc kubenswrapper[4676]: I0930 14:22:27.965228 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.212752 4676 generic.go:334] "Generic (PLEG): container finished" podID="29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" containerID="08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" exitCode=0 Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.213149 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.213223 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221","Type":"ContainerDied","Data":"08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4"} Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.213272 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29ef64ab-bbf7-4ab9-ba3d-938dd6d90221","Type":"ContainerDied","Data":"655f5b7434d20d0d0e2c11b0bea5e7dea41f26699d8203ce6eb06255a22bb5b8"} Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.213325 4676 scope.go:117] "RemoveContainer" containerID="08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.224166 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3250c50-5426-440d-a8ba-9a4f75001b16","Type":"ContainerStarted","Data":"485bed5e50751a50759e23adde3c90d7255ecaafcc9d38be3c600fb334bbc933"} Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.224217 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3250c50-5426-440d-a8ba-9a4f75001b16","Type":"ContainerStarted","Data":"de18b6ce9ecea93e6275d8c4d2f1397696be6317715e7437a8dc38c80995f42a"} Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.224424 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.243198 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.243177996 podStartE2EDuration="2.243177996s" podCreationTimestamp="2025-09-30 14:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:28.242928559 +0000 UTC m=+1452.226016999" watchObservedRunningTime="2025-09-30 14:22:28.243177996 +0000 UTC m=+1452.226266425" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.249639 4676 scope.go:117] "RemoveContainer" containerID="08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" Sep 30 14:22:28 crc kubenswrapper[4676]: E0930 14:22:28.249958 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4\": container with ID starting with 08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4 not found: ID does not exist" containerID="08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.249991 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4"} err="failed to get container status \"08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4\": rpc error: code = NotFound desc = could not find container \"08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4\": container with ID starting with 08d535ee44351c2ed718113b2ea1c57740e056373fbd52286dc8732c20e09ed4 not found: ID does not exist" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.264115 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.277654 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.288136 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:28 crc kubenswrapper[4676]: E0930 14:22:28.288606 4676 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" containerName="nova-scheduler-scheduler" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.288628 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" containerName="nova-scheduler-scheduler" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.288847 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" containerName="nova-scheduler-scheduler" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.289559 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.292285 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.315262 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.374727 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.375255 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfpk4\" (UniqueName: \"kubernetes.io/projected/2c789c39-fefd-45f4-b5f2-1e85711b5d88-kube-api-access-rfpk4\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.375391 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-config-data\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: W0930 14:22:28.383356 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea1ffd3a_4915_4ac8_9bdd_ee0bbad2e94a.slice/crio-64ecba4ef51dcb744f30879a755e9091300ef82e1090a1c11b57a9592fc2940b WatchSource:0}: Error finding container 64ecba4ef51dcb744f30879a755e9091300ef82e1090a1c11b57a9592fc2940b: Status 404 returned error can't find the container with id 64ecba4ef51dcb744f30879a755e9091300ef82e1090a1c11b57a9592fc2940b Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.389872 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.477037 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfpk4\" (UniqueName: \"kubernetes.io/projected/2c789c39-fefd-45f4-b5f2-1e85711b5d88-kube-api-access-rfpk4\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.477093 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-config-data\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.477224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 
14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.483554 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.484076 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-config-data\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.495792 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfpk4\" (UniqueName: \"kubernetes.io/projected/2c789c39-fefd-45f4-b5f2-1e85711b5d88-kube-api-access-rfpk4\") pod \"nova-scheduler-0\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") " pod="openstack/nova-scheduler-0" Sep 30 14:22:28 crc kubenswrapper[4676]: I0930 14:22:28.609237 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:22:29 crc kubenswrapper[4676]: W0930 14:22:29.066827 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c789c39_fefd_45f4_b5f2_1e85711b5d88.slice/crio-1873c4369e07d3b9b6f9d00e2c85fb66b0a527855abb33e55106cb1deb811a00 WatchSource:0}: Error finding container 1873c4369e07d3b9b6f9d00e2c85fb66b0a527855abb33e55106cb1deb811a00: Status 404 returned error can't find the container with id 1873c4369e07d3b9b6f9d00e2c85fb66b0a527855abb33e55106cb1deb811a00 Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.068116 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.237987 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c789c39-fefd-45f4-b5f2-1e85711b5d88","Type":"ContainerStarted","Data":"1873c4369e07d3b9b6f9d00e2c85fb66b0a527855abb33e55106cb1deb811a00"} Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.240679 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a","Type":"ContainerStarted","Data":"a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793"} Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.240737 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a","Type":"ContainerStarted","Data":"a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676"} Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.240755 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a","Type":"ContainerStarted","Data":"64ecba4ef51dcb744f30879a755e9091300ef82e1090a1c11b57a9592fc2940b"} Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 
14:22:29.268468 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.268447714 podStartE2EDuration="2.268447714s" podCreationTimestamp="2025-09-30 14:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:29.259277284 +0000 UTC m=+1453.242365723" watchObservedRunningTime="2025-09-30 14:22:29.268447714 +0000 UTC m=+1453.251536143" Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.446101 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28068cf5-7fa1-4380-be99-4c0951283c8a" path="/var/lib/kubelet/pods/28068cf5-7fa1-4380-be99-4c0951283c8a/volumes" Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.446690 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ef64ab-bbf7-4ab9-ba3d-938dd6d90221" path="/var/lib/kubelet/pods/29ef64ab-bbf7-4ab9-ba3d-938dd6d90221/volumes" Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.876600 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:22:29 crc kubenswrapper[4676]: I0930 14:22:29.877982 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:22:30 crc kubenswrapper[4676]: I0930 14:22:30.259458 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c789c39-fefd-45f4-b5f2-1e85711b5d88","Type":"ContainerStarted","Data":"c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0"} Sep 30 14:22:30 crc kubenswrapper[4676]: I0930 14:22:30.278426 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.278407981 podStartE2EDuration="2.278407981s" podCreationTimestamp="2025-09-30 14:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:30.274860068 +0000 UTC m=+1454.257948497" watchObservedRunningTime="2025-09-30 14:22:30.278407981 +0000 UTC m=+1454.261496410" Sep 30 14:22:31 crc kubenswrapper[4676]: I0930 14:22:31.288420 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:22:31 crc kubenswrapper[4676]: I0930 14:22:31.288848 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" containerName="kube-state-metrics" containerID="cri-o://f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738" gracePeriod=30 Sep 30 14:22:31 crc kubenswrapper[4676]: I0930 14:22:31.848068 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 14:22:31 crc kubenswrapper[4676]: I0930 14:22:31.945926 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-785br\" (UniqueName: \"kubernetes.io/projected/ad8d4649-f28a-4d12-884f-44308450c02b-kube-api-access-785br\") pod \"ad8d4649-f28a-4d12-884f-44308450c02b\" (UID: \"ad8d4649-f28a-4d12-884f-44308450c02b\") " Sep 30 14:22:31 crc kubenswrapper[4676]: I0930 14:22:31.968065 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8d4649-f28a-4d12-884f-44308450c02b-kube-api-access-785br" (OuterVolumeSpecName: "kube-api-access-785br") pod "ad8d4649-f28a-4d12-884f-44308450c02b" (UID: "ad8d4649-f28a-4d12-884f-44308450c02b"). InnerVolumeSpecName "kube-api-access-785br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.047984 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-785br\" (UniqueName: \"kubernetes.io/projected/ad8d4649-f28a-4d12-884f-44308450c02b-kube-api-access-785br\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.278616 4676 generic.go:334] "Generic (PLEG): container finished" podID="ad8d4649-f28a-4d12-884f-44308450c02b" containerID="f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738" exitCode=2 Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.278657 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad8d4649-f28a-4d12-884f-44308450c02b","Type":"ContainerDied","Data":"f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738"} Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.278682 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad8d4649-f28a-4d12-884f-44308450c02b","Type":"ContainerDied","Data":"489c0e0f1d022156a69c64aa6de6015d64a148f632f31bbe91c64858f5e199d0"} Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.278697 4676 scope.go:117] "RemoveContainer" containerID="f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.278702 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.307853 4676 scope.go:117] "RemoveContainer" containerID="f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738" Sep 30 14:22:32 crc kubenswrapper[4676]: E0930 14:22:32.309082 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738\": container with ID starting with f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738 not found: ID does not exist" containerID="f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.309121 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738"} err="failed to get container status \"f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738\": rpc error: code = NotFound desc = could not find container \"f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738\": container with ID starting with f5f8191a45fd716a3ac9c49c6e38b898a0404f7fa3f2f5cdcf56ce5ef014b738 not found: ID does not exist" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.314234 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.330405 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.339698 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:22:32 crc kubenswrapper[4676]: E0930 14:22:32.340192 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" containerName="kube-state-metrics" Sep 30 14:22:32 crc 
kubenswrapper[4676]: I0930 14:22:32.340206 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" containerName="kube-state-metrics" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.340437 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" containerName="kube-state-metrics" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.341186 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.345227 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.345422 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.350039 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.456387 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg5p4\" (UniqueName: \"kubernetes.io/projected/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-api-access-sg5p4\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.456525 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.456598 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.456656 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.558279 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg5p4\" (UniqueName: \"kubernetes.io/projected/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-api-access-sg5p4\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.558374 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.558401 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.558446 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.562794 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.562801 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.563651 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.580016 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg5p4\" (UniqueName: \"kubernetes.io/projected/ab292b94-70ab-4d77-9100-d6db2654e3e2-kube-api-access-sg5p4\") pod \"kube-state-metrics-0\" (UID: \"ab292b94-70ab-4d77-9100-d6db2654e3e2\") " pod="openstack/kube-state-metrics-0" Sep 30 14:22:32 crc kubenswrapper[4676]: I0930 14:22:32.667277 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.155727 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.156443 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-central-agent" containerID="cri-o://bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d" gracePeriod=30 Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.156524 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="sg-core" containerID="cri-o://8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4" gracePeriod=30 Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.156578 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="proxy-httpd" containerID="cri-o://aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e" gracePeriod=30 Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.156591 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-notification-agent" containerID="cri-o://b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b" gracePeriod=30 Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.189960 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.288154 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"ab292b94-70ab-4d77-9100-d6db2654e3e2","Type":"ContainerStarted","Data":"8a823ec0a4ec09c4c1e10c9c5ab0ac55385b02ebef6e153d4161e87dfd7ac71d"} Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.290355 4676 generic.go:334] "Generic (PLEG): container finished" podID="551886de-0586-40d2-9d65-f19496c555db" containerID="8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4" exitCode=2 Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.290401 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerDied","Data":"8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4"} Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.450579 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" path="/var/lib/kubelet/pods/ad8d4649-f28a-4d12-884f-44308450c02b/volumes" Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.609790 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.838201 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:33 crc kubenswrapper[4676]: I0930 14:22:33.897660 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.073203 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z9nrs"] Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.302336 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ab292b94-70ab-4d77-9100-d6db2654e3e2","Type":"ContainerStarted","Data":"051aefd4b106b3f2b64024282db3ac899ee4d6dc2cbb9efe5e90fbf4b705d859"} Sep 30 14:22:34 
crc kubenswrapper[4676]: I0930 14:22:34.302405 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.305122 4676 generic.go:334] "Generic (PLEG): container finished" podID="551886de-0586-40d2-9d65-f19496c555db" containerID="aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e" exitCode=0 Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.305154 4676 generic.go:334] "Generic (PLEG): container finished" podID="551886de-0586-40d2-9d65-f19496c555db" containerID="bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d" exitCode=0 Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.305199 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerDied","Data":"aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e"} Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.305239 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerDied","Data":"bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d"} Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.323983 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.973507345 podStartE2EDuration="2.323961879s" podCreationTimestamp="2025-09-30 14:22:32 +0000 UTC" firstStartedPulling="2025-09-30 14:22:33.197050437 +0000 UTC m=+1457.180138866" lastFinishedPulling="2025-09-30 14:22:33.547504971 +0000 UTC m=+1457.530593400" observedRunningTime="2025-09-30 14:22:34.322306096 +0000 UTC m=+1458.305394535" watchObservedRunningTime="2025-09-30 14:22:34.323961879 +0000 UTC m=+1458.307050308" Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.875986 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 14:22:34 crc kubenswrapper[4676]: I0930 14:22:34.876043 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 14:22:35 crc kubenswrapper[4676]: I0930 14:22:35.315910 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z9nrs" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="registry-server" containerID="cri-o://422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275" gracePeriod=2 Sep 30 14:22:35 crc kubenswrapper[4676]: I0930 14:22:35.903260 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:35 crc kubenswrapper[4676]: I0930 14:22:35.903325 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:35 crc kubenswrapper[4676]: I0930 14:22:35.917305 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.033724 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-catalog-content\") pod \"be110bf6-fee0-4775-a76c-938038512ef6\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.033855 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nct9\" (UniqueName: \"kubernetes.io/projected/be110bf6-fee0-4775-a76c-938038512ef6-kube-api-access-2nct9\") pod \"be110bf6-fee0-4775-a76c-938038512ef6\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.033985 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-utilities\") pod \"be110bf6-fee0-4775-a76c-938038512ef6\" (UID: \"be110bf6-fee0-4775-a76c-938038512ef6\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.034929 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-utilities" (OuterVolumeSpecName: "utilities") pod "be110bf6-fee0-4775-a76c-938038512ef6" (UID: "be110bf6-fee0-4775-a76c-938038512ef6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.040342 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be110bf6-fee0-4775-a76c-938038512ef6-kube-api-access-2nct9" (OuterVolumeSpecName: "kube-api-access-2nct9") pod "be110bf6-fee0-4775-a76c-938038512ef6" (UID: "be110bf6-fee0-4775-a76c-938038512ef6"). InnerVolumeSpecName "kube-api-access-2nct9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.082912 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be110bf6-fee0-4775-a76c-938038512ef6" (UID: "be110bf6-fee0-4775-a76c-938038512ef6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.135812 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nct9\" (UniqueName: \"kubernetes.io/projected/be110bf6-fee0-4775-a76c-938038512ef6-kube-api-access-2nct9\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.136097 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.136187 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be110bf6-fee0-4775-a76c-938038512ef6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.150416 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.239537 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-log-httpd\") pod \"551886de-0586-40d2-9d65-f19496c555db\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.239826 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-combined-ca-bundle\") pod \"551886de-0586-40d2-9d65-f19496c555db\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.239965 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-sg-core-conf-yaml\") pod \"551886de-0586-40d2-9d65-f19496c555db\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.240169 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-config-data\") pod \"551886de-0586-40d2-9d65-f19496c555db\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.240258 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-scripts\") pod \"551886de-0586-40d2-9d65-f19496c555db\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.240340 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-run-httpd\") pod \"551886de-0586-40d2-9d65-f19496c555db\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.240475 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxntr\" (UniqueName: \"kubernetes.io/projected/551886de-0586-40d2-9d65-f19496c555db-kube-api-access-hxntr\") pod \"551886de-0586-40d2-9d65-f19496c555db\" (UID: \"551886de-0586-40d2-9d65-f19496c555db\") " Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.244511 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "551886de-0586-40d2-9d65-f19496c555db" (UID: "551886de-0586-40d2-9d65-f19496c555db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.245208 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "551886de-0586-40d2-9d65-f19496c555db" (UID: "551886de-0586-40d2-9d65-f19496c555db"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.247052 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551886de-0586-40d2-9d65-f19496c555db-kube-api-access-hxntr" (OuterVolumeSpecName: "kube-api-access-hxntr") pod "551886de-0586-40d2-9d65-f19496c555db" (UID: "551886de-0586-40d2-9d65-f19496c555db"). InnerVolumeSpecName "kube-api-access-hxntr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.253012 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-scripts" (OuterVolumeSpecName: "scripts") pod "551886de-0586-40d2-9d65-f19496c555db" (UID: "551886de-0586-40d2-9d65-f19496c555db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.279202 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "551886de-0586-40d2-9d65-f19496c555db" (UID: "551886de-0586-40d2-9d65-f19496c555db"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.323329 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "551886de-0586-40d2-9d65-f19496c555db" (UID: "551886de-0586-40d2-9d65-f19496c555db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.327242 4676 generic.go:334] "Generic (PLEG): container finished" podID="551886de-0586-40d2-9d65-f19496c555db" containerID="b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b" exitCode=0 Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.327359 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.328291 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerDied","Data":"b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b"} Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.328330 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"551886de-0586-40d2-9d65-f19496c555db","Type":"ContainerDied","Data":"02bef6143c61f7e746313cd45735112772f757786d32f59f3834ba0fbf7dee37"} Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.328351 4676 scope.go:117] "RemoveContainer" containerID="aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.334670 4676 generic.go:334] "Generic (PLEG): container finished" podID="be110bf6-fee0-4775-a76c-938038512ef6" containerID="422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275" exitCode=0 Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.334737 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z9nrs" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.334914 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9nrs" event={"ID":"be110bf6-fee0-4775-a76c-938038512ef6","Type":"ContainerDied","Data":"422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275"} Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.335041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9nrs" event={"ID":"be110bf6-fee0-4775-a76c-938038512ef6","Type":"ContainerDied","Data":"d98931d4a192aa0a637c8193c426a0ec484a7a486365741213ea8e7d4cfc7811"} Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.343009 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxntr\" (UniqueName: \"kubernetes.io/projected/551886de-0586-40d2-9d65-f19496c555db-kube-api-access-hxntr\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.343051 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.343063 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.343075 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.343087 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.343096 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/551886de-0586-40d2-9d65-f19496c555db-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.355679 4676 scope.go:117] "RemoveContainer" containerID="8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.383520 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z9nrs"] Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.389495 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-config-data" (OuterVolumeSpecName: "config-data") pod "551886de-0586-40d2-9d65-f19496c555db" (UID: "551886de-0586-40d2-9d65-f19496c555db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.392466 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z9nrs"] Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.395998 4676 scope.go:117] "RemoveContainer" containerID="b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.418958 4676 scope.go:117] "RemoveContainer" containerID="bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.444555 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/551886de-0586-40d2-9d65-f19496c555db-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.498584 4676 scope.go:117] "RemoveContainer" containerID="aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.499068 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e\": container with ID starting with aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e not found: ID does not exist" containerID="aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.499109 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e"} err="failed to get container status \"aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e\": rpc error: code = NotFound desc = could not find container \"aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e\": container with ID starting with 
aa687090bceded467ced263240b51137f6f0f09d3b601ca8d12e12c3ea6beb4e not found: ID does not exist" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.499140 4676 scope.go:117] "RemoveContainer" containerID="8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.499653 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4\": container with ID starting with 8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4 not found: ID does not exist" containerID="8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.499684 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4"} err="failed to get container status \"8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4\": rpc error: code = NotFound desc = could not find container \"8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4\": container with ID starting with 8bef1ec8969b5f57b63e138b94702194da7c4a0eb4136ff8dbe0405806081bd4 not found: ID does not exist" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.499704 4676 scope.go:117] "RemoveContainer" containerID="b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.500056 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b\": container with ID starting with b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b not found: ID does not exist" containerID="b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b" Sep 30 14:22:36 crc 
kubenswrapper[4676]: I0930 14:22:36.500089 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b"} err="failed to get container status \"b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b\": rpc error: code = NotFound desc = could not find container \"b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b\": container with ID starting with b095d607af66caf02415d6e41a8008b217cce04367dfa59a33694fb00433d26b not found: ID does not exist" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.500108 4676 scope.go:117] "RemoveContainer" containerID="bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.500388 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d\": container with ID starting with bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d not found: ID does not exist" containerID="bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.500427 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d"} err="failed to get container status \"bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d\": rpc error: code = NotFound desc = could not find container \"bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d\": container with ID starting with bc962a29570941ce10e5edf24761529f5cfee013d3ceb729eee3c55dddb9277d not found: ID does not exist" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.500447 4676 scope.go:117] "RemoveContainer" containerID="422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275" Sep 30 
14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.519247 4676 scope.go:117] "RemoveContainer" containerID="a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.567370 4676 scope.go:117] "RemoveContainer" containerID="ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.590030 4676 scope.go:117] "RemoveContainer" containerID="422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.590071 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="ad8d4649-f28a-4d12-884f-44308450c02b" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.593383 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275\": container with ID starting with 422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275 not found: ID does not exist" containerID="422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.593599 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275"} err="failed to get container status \"422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275\": rpc error: code = NotFound desc = could not find container \"422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275\": container with ID starting with 422ec58290029a743e195f429ff10be0a08f1f2ce25f3b54d3f5e408ea47c275 not found: ID does not exist" Sep 30 14:22:36 crc 
kubenswrapper[4676]: I0930 14:22:36.593755 4676 scope.go:117] "RemoveContainer" containerID="a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.594324 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a\": container with ID starting with a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a not found: ID does not exist" containerID="a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.594396 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a"} err="failed to get container status \"a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a\": rpc error: code = NotFound desc = could not find container \"a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a\": container with ID starting with a8d91489a641685c7920b38d3cd7ed9dbd4943a183729eb3f8b638cffb8cb83a not found: ID does not exist" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.594423 4676 scope.go:117] "RemoveContainer" containerID="ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.594951 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c\": container with ID starting with ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c not found: ID does not exist" containerID="ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.595117 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c"} err="failed to get container status \"ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c\": rpc error: code = NotFound desc = could not find container \"ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c\": container with ID starting with ae5d5cccd9f0468b8be7e5bb1a335da24dcb42caca75add54f016e59b38f7d4c not found: ID does not exist" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.669197 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.683406 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694076 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.694601 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="extract-content" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694623 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="extract-content" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.694648 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="extract-utilities" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694657 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="extract-utilities" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.694681 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="sg-core" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694691 4676 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="sg-core" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.694711 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-notification-agent" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694720 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-notification-agent" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.694732 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="registry-server" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694740 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="registry-server" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.694755 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-central-agent" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694763 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-central-agent" Sep 30 14:22:36 crc kubenswrapper[4676]: E0930 14:22:36.694778 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="proxy-httpd" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.694786 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="proxy-httpd" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.695007 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="be110bf6-fee0-4775-a76c-938038512ef6" containerName="registry-server" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.695029 4676 
memory_manager.go:354] "RemoveStaleState removing state" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="sg-core" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.695042 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="proxy-httpd" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.695057 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-notification-agent" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.695072 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="551886de-0586-40d2-9d65-f19496c555db" containerName="ceilometer-central-agent" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.696766 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.699183 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.699266 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.700503 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.707476 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.752799 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-config-data\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 
14:22:36.753319 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.753466 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-scripts\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.753630 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.753814 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.753974 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc46v\" (UniqueName: \"kubernetes.io/projected/73a5791c-b9cc-4142-a4d8-fb05651a30af-kube-api-access-dc46v\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.754119 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-run-httpd\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " 
pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.754241 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-log-httpd\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.754357 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857143 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-run-httpd\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857226 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-log-httpd\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857400 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-config-data\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857420 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857460 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-scripts\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857534 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857571 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc46v\" (UniqueName: \"kubernetes.io/projected/73a5791c-b9cc-4142-a4d8-fb05651a30af-kube-api-access-dc46v\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.857711 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-log-httpd\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc 
kubenswrapper[4676]: I0930 14:22:36.858322 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-run-httpd\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.861575 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-scripts\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.861723 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-config-data\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.863391 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.868112 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.873032 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:36 crc kubenswrapper[4676]: I0930 14:22:36.876059 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc46v\" (UniqueName: \"kubernetes.io/projected/73a5791c-b9cc-4142-a4d8-fb05651a30af-kube-api-access-dc46v\") pod \"ceilometer-0\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " pod="openstack/ceilometer-0" Sep 30 14:22:37 crc kubenswrapper[4676]: I0930 14:22:37.020098 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:37.454061 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551886de-0586-40d2-9d65-f19496c555db" path="/var/lib/kubelet/pods/551886de-0586-40d2-9d65-f19496c555db/volumes" Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:37.455849 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be110bf6-fee0-4775-a76c-938038512ef6" path="/var/lib/kubelet/pods/be110bf6-fee0-4775-a76c-938038512ef6/volumes" Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:37.512271 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:38 crc kubenswrapper[4676]: W0930 14:22:37.516644 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73a5791c_b9cc_4142_a4d8_fb05651a30af.slice/crio-12377e7449161fbf854adea3b4d9ada41c73dd39468ee410a1d076a4468f3fb0 WatchSource:0}: Error finding container 12377e7449161fbf854adea3b4d9ada41c73dd39468ee410a1d076a4468f3fb0: Status 404 returned error can't find the container with id 12377e7449161fbf854adea3b4d9ada41c73dd39468ee410a1d076a4468f3fb0 Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:37.885797 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:22:38 crc kubenswrapper[4676]: 
I0930 14:22:37.885860 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:38.357301 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerStarted","Data":"12377e7449161fbf854adea3b4d9ada41c73dd39468ee410a1d076a4468f3fb0"} Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:38.609453 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:38.636062 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:38.978470 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:38 crc kubenswrapper[4676]: I0930 14:22:38.978557 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 14:22:39 crc kubenswrapper[4676]: I0930 14:22:39.369145 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerStarted","Data":"acfbbd4f569177f7f7385e31ee7738a11a51457da6b4f231a50fd18a85e57e43"} Sep 30 14:22:39 crc kubenswrapper[4676]: I0930 14:22:39.369729 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerStarted","Data":"35862c4715b94808f02bf304718d30edfc4467ba6621b0ea7be5192d549fb009"} Sep 30 14:22:39 crc kubenswrapper[4676]: I0930 14:22:39.397584 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 14:22:40 crc kubenswrapper[4676]: I0930 14:22:40.392208 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerStarted","Data":"94aa762bebba35a3d55e67b72f084b5ad319a22f9b2bfeafeda69e48da3dd073"} Sep 30 14:22:41 crc kubenswrapper[4676]: I0930 14:22:41.403539 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerStarted","Data":"cec1a94a39a82935e5a597ffc5613557865d42884d23c26753dd5041e9bf8cd9"} Sep 30 14:22:41 crc kubenswrapper[4676]: I0930 14:22:41.403914 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:22:41 crc kubenswrapper[4676]: I0930 14:22:41.429157 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.001878057 podStartE2EDuration="5.429133217s" podCreationTimestamp="2025-09-30 14:22:36 +0000 UTC" firstStartedPulling="2025-09-30 14:22:37.540059285 +0000 UTC m=+1461.523147714" lastFinishedPulling="2025-09-30 14:22:40.967314445 +0000 UTC m=+1464.950402874" observedRunningTime="2025-09-30 14:22:41.427696239 +0000 UTC m=+1465.410784678" watchObservedRunningTime="2025-09-30 14:22:41.429133217 +0000 UTC m=+1465.412221646" Sep 30 14:22:42 crc kubenswrapper[4676]: I0930 14:22:42.686995 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 14:22:44 crc kubenswrapper[4676]: I0930 14:22:44.883645 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Sep 30 14:22:44 crc kubenswrapper[4676]: I0930 14:22:44.884292 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 14:22:44 crc kubenswrapper[4676]: I0930 14:22:44.889673 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 14:22:44 crc kubenswrapper[4676]: I0930 14:22:44.892738 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.440326 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.472739 4676 generic.go:334] "Generic (PLEG): container finished" podID="818f60d7-30b2-4ace-a2d0-51f551976dca" containerID="be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014" exitCode=137 Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.472788 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.472780 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"818f60d7-30b2-4ace-a2d0-51f551976dca","Type":"ContainerDied","Data":"be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014"} Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.472835 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"818f60d7-30b2-4ace-a2d0-51f551976dca","Type":"ContainerDied","Data":"d4fd30550352f4bc0cbe97b899e1143d5a6e98070774025f9ae2f2667ceeb39c"} Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.472857 4676 scope.go:117] "RemoveContainer" containerID="be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.512157 4676 scope.go:117] "RemoveContainer" containerID="be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014" Sep 30 14:22:46 crc kubenswrapper[4676]: E0930 14:22:46.515776 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014\": container with ID starting with be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014 not found: ID does not exist" containerID="be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.515830 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014"} err="failed to get container status \"be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014\": rpc error: code = NotFound desc = could not find container \"be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014\": container with ID starting with 
be75cfbb6f3ecbff302c559f35f0f56ccfd1a4cf26151c405bd14b56e1b34014 not found: ID does not exist" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.562586 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvg7d\" (UniqueName: \"kubernetes.io/projected/818f60d7-30b2-4ace-a2d0-51f551976dca-kube-api-access-kvg7d\") pod \"818f60d7-30b2-4ace-a2d0-51f551976dca\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.562700 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-config-data\") pod \"818f60d7-30b2-4ace-a2d0-51f551976dca\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.562938 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-combined-ca-bundle\") pod \"818f60d7-30b2-4ace-a2d0-51f551976dca\" (UID: \"818f60d7-30b2-4ace-a2d0-51f551976dca\") " Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.569264 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818f60d7-30b2-4ace-a2d0-51f551976dca-kube-api-access-kvg7d" (OuterVolumeSpecName: "kube-api-access-kvg7d") pod "818f60d7-30b2-4ace-a2d0-51f551976dca" (UID: "818f60d7-30b2-4ace-a2d0-51f551976dca"). InnerVolumeSpecName "kube-api-access-kvg7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.592356 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-config-data" (OuterVolumeSpecName: "config-data") pod "818f60d7-30b2-4ace-a2d0-51f551976dca" (UID: "818f60d7-30b2-4ace-a2d0-51f551976dca"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.599576 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "818f60d7-30b2-4ace-a2d0-51f551976dca" (UID: "818f60d7-30b2-4ace-a2d0-51f551976dca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.665119 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.665154 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvg7d\" (UniqueName: \"kubernetes.io/projected/818f60d7-30b2-4ace-a2d0-51f551976dca-kube-api-access-kvg7d\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.665168 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818f60d7-30b2-4ace-a2d0-51f551976dca-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.806860 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.819740 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.830418 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:46 crc kubenswrapper[4676]: E0930 14:22:46.831069 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818f60d7-30b2-4ace-a2d0-51f551976dca" 
containerName="nova-cell1-novncproxy-novncproxy" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.831178 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="818f60d7-30b2-4ace-a2d0-51f551976dca" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.831514 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="818f60d7-30b2-4ace-a2d0-51f551976dca" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.832283 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.834328 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.834517 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.834444 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.841137 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.868578 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.868990 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.869176 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szvsl\" (UniqueName: \"kubernetes.io/projected/1cc6f007-8ed9-4512-8b1b-70e2081f873a-kube-api-access-szvsl\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.869281 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.869310 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.970961 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.971005 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.971037 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.971149 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.971240 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szvsl\" (UniqueName: \"kubernetes.io/projected/1cc6f007-8ed9-4512-8b1b-70e2081f873a-kube-api-access-szvsl\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.975684 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.975915 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.976388 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.976776 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc6f007-8ed9-4512-8b1b-70e2081f873a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:46 crc kubenswrapper[4676]: I0930 14:22:46.986032 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szvsl\" (UniqueName: \"kubernetes.io/projected/1cc6f007-8ed9-4512-8b1b-70e2081f873a-kube-api-access-szvsl\") pod \"nova-cell1-novncproxy-0\" (UID: \"1cc6f007-8ed9-4512-8b1b-70e2081f873a\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:47 crc kubenswrapper[4676]: I0930 14:22:47.166070 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:47 crc kubenswrapper[4676]: I0930 14:22:47.445140 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818f60d7-30b2-4ace-a2d0-51f551976dca" path="/var/lib/kubelet/pods/818f60d7-30b2-4ace-a2d0-51f551976dca/volumes" Sep 30 14:22:47 crc kubenswrapper[4676]: I0930 14:22:47.647004 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 14:22:47 crc kubenswrapper[4676]: I0930 14:22:47.890189 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:22:47 crc kubenswrapper[4676]: I0930 14:22:47.891027 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 14:22:47 crc kubenswrapper[4676]: I0930 14:22:47.891124 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:22:47 crc kubenswrapper[4676]: I0930 14:22:47.893568 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.492230 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1cc6f007-8ed9-4512-8b1b-70e2081f873a","Type":"ContainerStarted","Data":"1ac57050cb5a1c5aae7cfb180dd258c0e2cfc270a0c0e37a51985bd8e9efb4bf"} Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.492283 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1cc6f007-8ed9-4512-8b1b-70e2081f873a","Type":"ContainerStarted","Data":"ac74ff6d81617d4458f2200d87d15991b57d42955ade8bc4d3465acdd4c61786"} Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.492743 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.509417 4676 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.530386 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.530363262 podStartE2EDuration="2.530363262s" podCreationTimestamp="2025-09-30 14:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:48.522907266 +0000 UTC m=+1472.505995695" watchObservedRunningTime="2025-09-30 14:22:48.530363262 +0000 UTC m=+1472.513451691" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.721575 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-dq8jp"] Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.723966 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.734195 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-dq8jp"] Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.814477 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-config\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.814756 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 
14:22:48.814866 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7s7x\" (UniqueName: \"kubernetes.io/projected/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-kube-api-access-l7s7x\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.815077 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.815183 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.815268 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.917171 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc 
kubenswrapper[4676]: I0930 14:22:48.917479 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.917581 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.917754 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-config\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.917919 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.918079 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7s7x\" (UniqueName: \"kubernetes.io/projected/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-kube-api-access-l7s7x\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.918171 
4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.918323 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.918413 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.918989 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.919775 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-config\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:48 crc kubenswrapper[4676]: I0930 14:22:48.943239 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7s7x\" (UniqueName: 
\"kubernetes.io/projected/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-kube-api-access-l7s7x\") pod \"dnsmasq-dns-59cf4bdb65-dq8jp\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:49 crc kubenswrapper[4676]: I0930 14:22:49.054279 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:49 crc kubenswrapper[4676]: I0930 14:22:49.574329 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-dq8jp"] Sep 30 14:22:49 crc kubenswrapper[4676]: W0930 14:22:49.586252 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6090ed01_b4ad_4d41_8e55_a250b4fb9d1c.slice/crio-e46645cd417d3e064b1bee34a8bbb604763c29d757046d9413f923d3d6bebbb3 WatchSource:0}: Error finding container e46645cd417d3e064b1bee34a8bbb604763c29d757046d9413f923d3d6bebbb3: Status 404 returned error can't find the container with id e46645cd417d3e064b1bee34a8bbb604763c29d757046d9413f923d3d6bebbb3 Sep 30 14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.520592 4676 generic.go:334] "Generic (PLEG): container finished" podID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerID="7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46" exitCode=0 Sep 30 14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.520818 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" event={"ID":"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c","Type":"ContainerDied","Data":"7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46"} Sep 30 14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.522621 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" event={"ID":"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c","Type":"ContainerStarted","Data":"e46645cd417d3e064b1bee34a8bbb604763c29d757046d9413f923d3d6bebbb3"} Sep 30 
14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.686712 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.687277 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-central-agent" containerID="cri-o://35862c4715b94808f02bf304718d30edfc4467ba6621b0ea7be5192d549fb009" gracePeriod=30 Sep 30 14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.687398 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="proxy-httpd" containerID="cri-o://cec1a94a39a82935e5a597ffc5613557865d42884d23c26753dd5041e9bf8cd9" gracePeriod=30 Sep 30 14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.687430 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="sg-core" containerID="cri-o://94aa762bebba35a3d55e67b72f084b5ad319a22f9b2bfeafeda69e48da3dd073" gracePeriod=30 Sep 30 14:22:50 crc kubenswrapper[4676]: I0930 14:22:50.687477 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-notification-agent" containerID="cri-o://acfbbd4f569177f7f7385e31ee7738a11a51457da6b4f231a50fd18a85e57e43" gracePeriod=30 Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.520331 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.538158 4676 generic.go:334] "Generic (PLEG): container finished" podID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerID="cec1a94a39a82935e5a597ffc5613557865d42884d23c26753dd5041e9bf8cd9" exitCode=0 Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 
14:22:51.538200 4676 generic.go:334] "Generic (PLEG): container finished" podID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerID="94aa762bebba35a3d55e67b72f084b5ad319a22f9b2bfeafeda69e48da3dd073" exitCode=2 Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.538217 4676 generic.go:334] "Generic (PLEG): container finished" podID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerID="acfbbd4f569177f7f7385e31ee7738a11a51457da6b4f231a50fd18a85e57e43" exitCode=0 Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.538226 4676 generic.go:334] "Generic (PLEG): container finished" podID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerID="35862c4715b94808f02bf304718d30edfc4467ba6621b0ea7be5192d549fb009" exitCode=0 Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.538263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerDied","Data":"cec1a94a39a82935e5a597ffc5613557865d42884d23c26753dd5041e9bf8cd9"} Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.538305 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerDied","Data":"94aa762bebba35a3d55e67b72f084b5ad319a22f9b2bfeafeda69e48da3dd073"} Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.538318 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerDied","Data":"acfbbd4f569177f7f7385e31ee7738a11a51457da6b4f231a50fd18a85e57e43"} Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.538328 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerDied","Data":"35862c4715b94808f02bf304718d30edfc4467ba6621b0ea7be5192d549fb009"} Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.548563 4676 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-log" containerID="cri-o://a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676" gracePeriod=30 Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.549728 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" event={"ID":"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c","Type":"ContainerStarted","Data":"03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1"} Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.549764 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.550058 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-api" containerID="cri-o://a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793" gracePeriod=30 Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.570170 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" podStartSLOduration=3.570145902 podStartE2EDuration="3.570145902s" podCreationTimestamp="2025-09-30 14:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:51.568249122 +0000 UTC m=+1475.551337571" watchObservedRunningTime="2025-09-30 14:22:51.570145902 +0000 UTC m=+1475.553234341" Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.860089 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.989983 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc46v\" (UniqueName: \"kubernetes.io/projected/73a5791c-b9cc-4142-a4d8-fb05651a30af-kube-api-access-dc46v\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.990114 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-run-httpd\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.990189 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-log-httpd\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.990230 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-combined-ca-bundle\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.990295 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-config-data\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.990405 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-sg-core-conf-yaml\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.990438 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-scripts\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:51 crc kubenswrapper[4676]: I0930 14:22:51.990465 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-ceilometer-tls-certs\") pod \"73a5791c-b9cc-4142-a4d8-fb05651a30af\" (UID: \"73a5791c-b9cc-4142-a4d8-fb05651a30af\") " Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.000132 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.001673 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.025260 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a5791c-b9cc-4142-a4d8-fb05651a30af-kube-api-access-dc46v" (OuterVolumeSpecName: "kube-api-access-dc46v") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "kube-api-access-dc46v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.039127 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-scripts" (OuterVolumeSpecName: "scripts") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.093374 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.093403 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc46v\" (UniqueName: \"kubernetes.io/projected/73a5791c-b9cc-4142-a4d8-fb05651a30af-kube-api-access-dc46v\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.093414 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.093423 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73a5791c-b9cc-4142-a4d8-fb05651a30af-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 
crc kubenswrapper[4676]: I0930 14:22:52.113834 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.151285 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.167339 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.195236 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.195268 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.198329 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.233226 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-config-data" (OuterVolumeSpecName: "config-data") pod "73a5791c-b9cc-4142-a4d8-fb05651a30af" (UID: "73a5791c-b9cc-4142-a4d8-fb05651a30af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.297662 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.297703 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a5791c-b9cc-4142-a4d8-fb05651a30af-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.560978 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73a5791c-b9cc-4142-a4d8-fb05651a30af","Type":"ContainerDied","Data":"12377e7449161fbf854adea3b4d9ada41c73dd39468ee410a1d076a4468f3fb0"} Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.561041 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.561729 4676 scope.go:117] "RemoveContainer" containerID="cec1a94a39a82935e5a597ffc5613557865d42884d23c26753dd5041e9bf8cd9" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.564071 4676 generic.go:334] "Generic (PLEG): container finished" podID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerID="a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676" exitCode=143 Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.564136 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a","Type":"ContainerDied","Data":"a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676"} Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.624463 4676 scope.go:117] "RemoveContainer" containerID="94aa762bebba35a3d55e67b72f084b5ad319a22f9b2bfeafeda69e48da3dd073" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.638358 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.644986 4676 scope.go:117] "RemoveContainer" containerID="acfbbd4f569177f7f7385e31ee7738a11a51457da6b4f231a50fd18a85e57e43" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.654864 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.663476 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:52 crc kubenswrapper[4676]: E0930 14:22:52.663859 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="sg-core" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.663879 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="sg-core" Sep 30 14:22:52 crc 
kubenswrapper[4676]: E0930 14:22:52.663920 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-notification-agent" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.663926 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-notification-agent" Sep 30 14:22:52 crc kubenswrapper[4676]: E0930 14:22:52.663944 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="proxy-httpd" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.663950 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="proxy-httpd" Sep 30 14:22:52 crc kubenswrapper[4676]: E0930 14:22:52.663962 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-central-agent" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.663968 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-central-agent" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.664122 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="sg-core" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.664146 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-notification-agent" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.664303 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="ceilometer-central-agent" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.664319 4676 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" containerName="proxy-httpd" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.665970 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.668325 4676 scope.go:117] "RemoveContainer" containerID="35862c4715b94808f02bf304718d30edfc4467ba6621b0ea7be5192d549fb009" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.669974 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.670298 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.670663 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.676074 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.807954 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-log-httpd\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.808006 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-run-httpd\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.808033 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.808105 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswwj\" (UniqueName: \"kubernetes.io/projected/35b4f753-8942-4db3-960b-09f2316e8152-kube-api-access-sswwj\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.808148 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-scripts\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.808173 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.808235 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.808260 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-config-data\") pod \"ceilometer-0\" (UID: 
\"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909526 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909583 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-config-data\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-log-httpd\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909657 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-run-httpd\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909688 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909734 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sswwj\" (UniqueName: \"kubernetes.io/projected/35b4f753-8942-4db3-960b-09f2316e8152-kube-api-access-sswwj\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909768 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-scripts\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.909787 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.910267 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-log-httpd\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.910901 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-run-httpd\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.915098 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc 
kubenswrapper[4676]: I0930 14:22:52.915956 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.916060 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-scripts\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.916570 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.925946 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-config-data\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:52 crc kubenswrapper[4676]: I0930 14:22:52.932540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswwj\" (UniqueName: \"kubernetes.io/projected/35b4f753-8942-4db3-960b-09f2316e8152-kube-api-access-sswwj\") pod \"ceilometer-0\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " pod="openstack/ceilometer-0" Sep 30 14:22:53 crc kubenswrapper[4676]: I0930 14:22:53.009796 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:53 crc kubenswrapper[4676]: I0930 14:22:53.444218 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a5791c-b9cc-4142-a4d8-fb05651a30af" path="/var/lib/kubelet/pods/73a5791c-b9cc-4142-a4d8-fb05651a30af/volumes" Sep 30 14:22:53 crc kubenswrapper[4676]: I0930 14:22:53.528699 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:53 crc kubenswrapper[4676]: I0930 14:22:53.572564 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerStarted","Data":"39ba5e8433b6237d33415ca2f33681fa6d10dde8fe6f6a2a3a83f0ab865afee6"} Sep 30 14:22:54 crc kubenswrapper[4676]: I0930 14:22:54.558174 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:22:54 crc kubenswrapper[4676]: I0930 14:22:54.585106 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerStarted","Data":"840d2a7b624a1f31891c186d3f7fc820690288ad61fd7be641d78e0ad534c13c"} Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.128961 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.258554 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-config-data\") pod \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.258984 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-combined-ca-bundle\") pod \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.259048 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qp2t\" (UniqueName: \"kubernetes.io/projected/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-kube-api-access-7qp2t\") pod \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.259183 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-logs\") pod \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\" (UID: \"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a\") " Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.260312 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-logs" (OuterVolumeSpecName: "logs") pod "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" (UID: "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.266247 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-kube-api-access-7qp2t" (OuterVolumeSpecName: "kube-api-access-7qp2t") pod "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" (UID: "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a"). InnerVolumeSpecName "kube-api-access-7qp2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.309733 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" (UID: "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.310025 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-config-data" (OuterVolumeSpecName: "config-data") pod "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" (UID: "ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.361581 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.361618 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.361629 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qp2t\" (UniqueName: \"kubernetes.io/projected/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-kube-api-access-7qp2t\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.361639 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.597061 4676 generic.go:334] "Generic (PLEG): container finished" podID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerID="a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793" exitCode=0 Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.597141 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a","Type":"ContainerDied","Data":"a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793"} Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.597167 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a","Type":"ContainerDied","Data":"64ecba4ef51dcb744f30879a755e9091300ef82e1090a1c11b57a9592fc2940b"} Sep 30 14:22:55 crc kubenswrapper[4676]: 
I0930 14:22:55.597185 4676 scope.go:117] "RemoveContainer" containerID="a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.597302 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.601899 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerStarted","Data":"9578a81995e40a6e0156beb060209f226b6b3e252e78fa4cb4bd8fe1d8986ef1"} Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.639862 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.640188 4676 scope.go:117] "RemoveContainer" containerID="a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.651920 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.676995 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:55 crc kubenswrapper[4676]: E0930 14:22:55.677505 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-log" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.677533 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-log" Sep 30 14:22:55 crc kubenswrapper[4676]: E0930 14:22:55.677586 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-api" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.677596 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" 
containerName="nova-api-api" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.677814 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-api" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.677849 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" containerName="nova-api-log" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.679170 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.681344 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.681568 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.681768 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.688591 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.690156 4676 scope.go:117] "RemoveContainer" containerID="a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793" Sep 30 14:22:55 crc kubenswrapper[4676]: E0930 14:22:55.690693 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793\": container with ID starting with a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793 not found: ID does not exist" containerID="a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.690751 4676 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793"} err="failed to get container status \"a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793\": rpc error: code = NotFound desc = could not find container \"a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793\": container with ID starting with a3d445756cdce10eeb166f4cafd30feb11ae8771d7477a6e69b9bfdeb67a8793 not found: ID does not exist" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.690778 4676 scope.go:117] "RemoveContainer" containerID="a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676" Sep 30 14:22:55 crc kubenswrapper[4676]: E0930 14:22:55.691594 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676\": container with ID starting with a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676 not found: ID does not exist" containerID="a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.691655 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676"} err="failed to get container status \"a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676\": rpc error: code = NotFound desc = could not find container \"a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676\": container with ID starting with a8633ab480e720f25318372559fffacd77af460ef7c8cb22da21da981d967676 not found: ID does not exist" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.769247 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc3e41b-ae3f-4732-815b-76dda8e0730b-logs\") pod \"nova-api-0\" (UID: 
\"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.769676 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-config-data\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.769705 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwrk\" (UniqueName: \"kubernetes.io/projected/0bc3e41b-ae3f-4732-815b-76dda8e0730b-kube-api-access-nzwrk\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.769990 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.770249 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.770367 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-public-tls-certs\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 
14:22:55.871965 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-public-tls-certs\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.872134 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc3e41b-ae3f-4732-815b-76dda8e0730b-logs\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.872281 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzwrk\" (UniqueName: \"kubernetes.io/projected/0bc3e41b-ae3f-4732-815b-76dda8e0730b-kube-api-access-nzwrk\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.872353 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-config-data\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.872500 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.872967 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.872743 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc3e41b-ae3f-4732-815b-76dda8e0730b-logs\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.876225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-config-data\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.876557 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.876865 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.880969 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-public-tls-certs\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.896417 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzwrk\" (UniqueName: 
\"kubernetes.io/projected/0bc3e41b-ae3f-4732-815b-76dda8e0730b-kube-api-access-nzwrk\") pod \"nova-api-0\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") " pod="openstack/nova-api-0" Sep 30 14:22:55 crc kubenswrapper[4676]: I0930 14:22:55.996105 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:22:56 crc kubenswrapper[4676]: I0930 14:22:56.420671 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:22:56 crc kubenswrapper[4676]: I0930 14:22:56.612248 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc3e41b-ae3f-4732-815b-76dda8e0730b","Type":"ContainerStarted","Data":"a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c"} Sep 30 14:22:56 crc kubenswrapper[4676]: I0930 14:22:56.612530 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc3e41b-ae3f-4732-815b-76dda8e0730b","Type":"ContainerStarted","Data":"cbf5cb278a884cd4e569fdd7e08f4df75aebfd7b5ea4417a56e0c30391a3db4f"} Sep 30 14:22:56 crc kubenswrapper[4676]: I0930 14:22:56.615245 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerStarted","Data":"8a9de551d85030683ea8fbaa3a33a25d88158ddacaf1e1dfa1d2037561f0cc70"} Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.166223 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.184353 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.445801 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a" path="/var/lib/kubelet/pods/ea1ffd3a-4915-4ac8-9bdd-ee0bbad2e94a/volumes" Sep 
30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.631658 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc3e41b-ae3f-4732-815b-76dda8e0730b","Type":"ContainerStarted","Data":"90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00"} Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.656513 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.663012 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.66298484 podStartE2EDuration="2.66298484s" podCreationTimestamp="2025-09-30 14:22:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:57.652046585 +0000 UTC m=+1481.635135014" watchObservedRunningTime="2025-09-30 14:22:57.66298484 +0000 UTC m=+1481.646073269" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.861763 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4fvlz"] Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.863169 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.866820 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.867067 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.875825 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4fvlz"] Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.922091 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546w8\" (UniqueName: \"kubernetes.io/projected/e77accdf-897d-4abd-b12c-b28bf6406a78-kube-api-access-546w8\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.922212 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-scripts\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.922263 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:57 crc kubenswrapper[4676]: I0930 14:22:57.922299 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-config-data\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.024398 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-scripts\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.024730 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.024781 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-config-data\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.024821 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-546w8\" (UniqueName: \"kubernetes.io/projected/e77accdf-897d-4abd-b12c-b28bf6406a78-kube-api-access-546w8\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.029098 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.037653 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-scripts\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.038802 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-config-data\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.042303 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-546w8\" (UniqueName: \"kubernetes.io/projected/e77accdf-897d-4abd-b12c-b28bf6406a78-kube-api-access-546w8\") pod \"nova-cell1-cell-mapping-4fvlz\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.179664 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.644785 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerStarted","Data":"cdbcf323441d4f66d7eaaf7601aa0afb4a56983d49fab23b1d5b364f812d64bf"} Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.644847 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-central-agent" containerID="cri-o://840d2a7b624a1f31891c186d3f7fc820690288ad61fd7be641d78e0ad534c13c" gracePeriod=30 Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.645027 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="proxy-httpd" containerID="cri-o://cdbcf323441d4f66d7eaaf7601aa0afb4a56983d49fab23b1d5b364f812d64bf" gracePeriod=30 Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.645045 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="sg-core" containerID="cri-o://8a9de551d85030683ea8fbaa3a33a25d88158ddacaf1e1dfa1d2037561f0cc70" gracePeriod=30 Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.645062 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-notification-agent" containerID="cri-o://9578a81995e40a6e0156beb060209f226b6b3e252e78fa4cb4bd8fe1d8986ef1" gracePeriod=30 Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.645490 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:22:58 crc kubenswrapper[4676]: I0930 14:22:58.669849 4676 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4fvlz"] Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.056273 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.085159 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.046527285 podStartE2EDuration="7.085136973s" podCreationTimestamp="2025-09-30 14:22:52 +0000 UTC" firstStartedPulling="2025-09-30 14:22:53.533812157 +0000 UTC m=+1477.516900596" lastFinishedPulling="2025-09-30 14:22:57.572421855 +0000 UTC m=+1481.555510284" observedRunningTime="2025-09-30 14:22:58.684746926 +0000 UTC m=+1482.667835355" watchObservedRunningTime="2025-09-30 14:22:59.085136973 +0000 UTC m=+1483.068225402" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.127134 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dc5jg"] Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.127922 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" podUID="90feda1c-209e-474d-bd1c-eee343b5f674" containerName="dnsmasq-dns" containerID="cri-o://d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18" gracePeriod=10 Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.670764 4676 generic.go:334] "Generic (PLEG): container finished" podID="35b4f753-8942-4db3-960b-09f2316e8152" containerID="cdbcf323441d4f66d7eaaf7601aa0afb4a56983d49fab23b1d5b364f812d64bf" exitCode=0 Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.671093 4676 generic.go:334] "Generic (PLEG): container finished" podID="35b4f753-8942-4db3-960b-09f2316e8152" containerID="8a9de551d85030683ea8fbaa3a33a25d88158ddacaf1e1dfa1d2037561f0cc70" exitCode=2 Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.671104 4676 
generic.go:334] "Generic (PLEG): container finished" podID="35b4f753-8942-4db3-960b-09f2316e8152" containerID="9578a81995e40a6e0156beb060209f226b6b3e252e78fa4cb4bd8fe1d8986ef1" exitCode=0 Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.671110 4676 generic.go:334] "Generic (PLEG): container finished" podID="35b4f753-8942-4db3-960b-09f2316e8152" containerID="840d2a7b624a1f31891c186d3f7fc820690288ad61fd7be641d78e0ad534c13c" exitCode=0 Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.672130 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerDied","Data":"cdbcf323441d4f66d7eaaf7601aa0afb4a56983d49fab23b1d5b364f812d64bf"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.672165 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerDied","Data":"8a9de551d85030683ea8fbaa3a33a25d88158ddacaf1e1dfa1d2037561f0cc70"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.672174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerDied","Data":"9578a81995e40a6e0156beb060209f226b6b3e252e78fa4cb4bd8fe1d8986ef1"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.672183 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerDied","Data":"840d2a7b624a1f31891c186d3f7fc820690288ad61fd7be641d78e0ad534c13c"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.672300 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.675266 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4fvlz" event={"ID":"e77accdf-897d-4abd-b12c-b28bf6406a78","Type":"ContainerStarted","Data":"61489785c3d00ad3a9ec3beb08cedeeb572595f3b5c955cdb582f80269cd7247"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.675298 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4fvlz" event={"ID":"e77accdf-897d-4abd-b12c-b28bf6406a78","Type":"ContainerStarted","Data":"99a68e1f663edff3c865cebf6413ab277b66956ccc8ecfbff2933f6105e2b232"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.679554 4676 generic.go:334] "Generic (PLEG): container finished" podID="90feda1c-209e-474d-bd1c-eee343b5f674" containerID="d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18" exitCode=0 Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.679582 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" event={"ID":"90feda1c-209e-474d-bd1c-eee343b5f674","Type":"ContainerDied","Data":"d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.679600 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" event={"ID":"90feda1c-209e-474d-bd1c-eee343b5f674","Type":"ContainerDied","Data":"c52a4cf7dcdfeba2587705edd1ef20c18e064540f4bbc08f22bba60a9188a7a2"} Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.679617 4676 scope.go:117] "RemoveContainer" containerID="d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.679720 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-dc5jg" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.714678 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.718012 4676 scope.go:117] "RemoveContainer" containerID="85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.743490 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4fvlz" podStartSLOduration=2.743469706 podStartE2EDuration="2.743469706s" podCreationTimestamp="2025-09-30 14:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:22:59.719507191 +0000 UTC m=+1483.702595620" watchObservedRunningTime="2025-09-30 14:22:59.743469706 +0000 UTC m=+1483.726558135" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.750482 4676 scope.go:117] "RemoveContainer" containerID="d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18" Sep 30 14:22:59 crc kubenswrapper[4676]: E0930 14:22:59.751466 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18\": container with ID starting with d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18 not found: ID does not exist" containerID="d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.751515 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18"} err="failed to get container status \"d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18\": rpc error: code = 
NotFound desc = could not find container \"d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18\": container with ID starting with d2dee546d5ac2ad3f210578fb5199e6246e537e1afa877b9a9efa21051362f18 not found: ID does not exist" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.751542 4676 scope.go:117] "RemoveContainer" containerID="85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02" Sep 30 14:22:59 crc kubenswrapper[4676]: E0930 14:22:59.752546 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02\": container with ID starting with 85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02 not found: ID does not exist" containerID="85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.752576 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02"} err="failed to get container status \"85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02\": rpc error: code = NotFound desc = could not find container \"85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02\": container with ID starting with 85261a9fae223a658b9979f97e7159ed11991262b7a8fb2d6dfa71f1ba50cf02 not found: ID does not exist" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.760606 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-nb\") pod \"90feda1c-209e-474d-bd1c-eee343b5f674\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.760657 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-svc\") pod \"90feda1c-209e-474d-bd1c-eee343b5f674\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.760687 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-sb\") pod \"90feda1c-209e-474d-bd1c-eee343b5f674\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.760750 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-config\") pod \"90feda1c-209e-474d-bd1c-eee343b5f674\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.760775 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-swift-storage-0\") pod \"90feda1c-209e-474d-bd1c-eee343b5f674\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.760847 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sfw7\" (UniqueName: \"kubernetes.io/projected/90feda1c-209e-474d-bd1c-eee343b5f674-kube-api-access-5sfw7\") pod \"90feda1c-209e-474d-bd1c-eee343b5f674\" (UID: \"90feda1c-209e-474d-bd1c-eee343b5f674\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.769047 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90feda1c-209e-474d-bd1c-eee343b5f674-kube-api-access-5sfw7" (OuterVolumeSpecName: "kube-api-access-5sfw7") pod "90feda1c-209e-474d-bd1c-eee343b5f674" (UID: "90feda1c-209e-474d-bd1c-eee343b5f674"). InnerVolumeSpecName "kube-api-access-5sfw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.811795 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-config" (OuterVolumeSpecName: "config") pod "90feda1c-209e-474d-bd1c-eee343b5f674" (UID: "90feda1c-209e-474d-bd1c-eee343b5f674"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.812659 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90feda1c-209e-474d-bd1c-eee343b5f674" (UID: "90feda1c-209e-474d-bd1c-eee343b5f674"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.815760 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90feda1c-209e-474d-bd1c-eee343b5f674" (UID: "90feda1c-209e-474d-bd1c-eee343b5f674"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.816358 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90feda1c-209e-474d-bd1c-eee343b5f674" (UID: "90feda1c-209e-474d-bd1c-eee343b5f674"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.822514 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90feda1c-209e-474d-bd1c-eee343b5f674" (UID: "90feda1c-209e-474d-bd1c-eee343b5f674"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862405 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-log-httpd\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862450 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-sg-core-conf-yaml\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862481 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-config-data\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862503 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-combined-ca-bundle\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862522 4676 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-scripts\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862552 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sswwj\" (UniqueName: \"kubernetes.io/projected/35b4f753-8942-4db3-960b-09f2316e8152-kube-api-access-sswwj\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862586 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-ceilometer-tls-certs\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.862665 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-run-httpd\") pod \"35b4f753-8942-4db3-960b-09f2316e8152\" (UID: \"35b4f753-8942-4db3-960b-09f2316e8152\") " Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863141 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863158 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863167 4676 reconciler_common.go:293] "Volume detached for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863177 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sfw7\" (UniqueName: \"kubernetes.io/projected/90feda1c-209e-474d-bd1c-eee343b5f674-kube-api-access-5sfw7\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863187 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863195 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90feda1c-209e-474d-bd1c-eee343b5f674-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863470 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.863547 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.866037 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-scripts" (OuterVolumeSpecName: "scripts") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.866532 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b4f753-8942-4db3-960b-09f2316e8152-kube-api-access-sswwj" (OuterVolumeSpecName: "kube-api-access-sswwj") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "kube-api-access-sswwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.899112 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.912588 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.919984 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.920044 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.943133 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.955846 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-config-data" (OuterVolumeSpecName: "config-data") pod "35b4f753-8942-4db3-960b-09f2316e8152" (UID: "35b4f753-8942-4db3-960b-09f2316e8152"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.964694 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.964911 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.964986 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35b4f753-8942-4db3-960b-09f2316e8152-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.965039 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.965088 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.965144 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.965223 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35b4f753-8942-4db3-960b-09f2316e8152-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:22:59 crc kubenswrapper[4676]: I0930 14:22:59.965274 4676 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-sswwj\" (UniqueName: \"kubernetes.io/projected/35b4f753-8942-4db3-960b-09f2316e8152-kube-api-access-sswwj\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.011355 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dc5jg"] Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.018593 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-dc5jg"] Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.691728 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.695261 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35b4f753-8942-4db3-960b-09f2316e8152","Type":"ContainerDied","Data":"39ba5e8433b6237d33415ca2f33681fa6d10dde8fe6f6a2a3a83f0ab865afee6"} Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.695307 4676 scope.go:117] "RemoveContainer" containerID="cdbcf323441d4f66d7eaaf7601aa0afb4a56983d49fab23b1d5b364f812d64bf" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.726697 4676 scope.go:117] "RemoveContainer" containerID="8a9de551d85030683ea8fbaa3a33a25d88158ddacaf1e1dfa1d2037561f0cc70" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.734267 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.745433 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.753126 4676 scope.go:117] "RemoveContainer" containerID="9578a81995e40a6e0156beb060209f226b6b3e252e78fa4cb4bd8fe1d8986ef1" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.768146 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:23:00 crc 
kubenswrapper[4676]: E0930 14:23:00.768620 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90feda1c-209e-474d-bd1c-eee343b5f674" containerName="dnsmasq-dns" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.768639 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="90feda1c-209e-474d-bd1c-eee343b5f674" containerName="dnsmasq-dns" Sep 30 14:23:00 crc kubenswrapper[4676]: E0930 14:23:00.768669 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90feda1c-209e-474d-bd1c-eee343b5f674" containerName="init" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.768679 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="90feda1c-209e-474d-bd1c-eee343b5f674" containerName="init" Sep 30 14:23:00 crc kubenswrapper[4676]: E0930 14:23:00.768699 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="proxy-httpd" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.768707 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="proxy-httpd" Sep 30 14:23:00 crc kubenswrapper[4676]: E0930 14:23:00.768722 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="sg-core" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.768729 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="sg-core" Sep 30 14:23:00 crc kubenswrapper[4676]: E0930 14:23:00.768758 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-notification-agent" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.768766 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-notification-agent" Sep 30 14:23:00 crc kubenswrapper[4676]: E0930 
14:23:00.768782 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-central-agent" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.768789 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-central-agent" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.770042 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-notification-agent" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.770070 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="proxy-httpd" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.770090 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="ceilometer-central-agent" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.770109 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="90feda1c-209e-474d-bd1c-eee343b5f674" containerName="dnsmasq-dns" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.770141 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b4f753-8942-4db3-960b-09f2316e8152" containerName="sg-core" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.772392 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.775841 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.776348 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.790198 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.815541 4676 scope.go:117] "RemoveContainer" containerID="840d2a7b624a1f31891c186d3f7fc820690288ad61fd7be641d78e0ad534c13c" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.821995 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.912822 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.912978 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.913006 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-scripts\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " 
pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.913049 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-config-data\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.913116 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwsws\" (UniqueName: \"kubernetes.io/projected/2a301731-560b-431e-b265-ef436fa8eccb-kube-api-access-dwsws\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.913246 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a301731-560b-431e-b265-ef436fa8eccb-run-httpd\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.913466 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a301731-560b-431e-b265-ef436fa8eccb-log-httpd\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:00 crc kubenswrapper[4676]: I0930 14:23:00.913497 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.015732 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a301731-560b-431e-b265-ef436fa8eccb-log-httpd\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.015775 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.015831 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.015869 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.015914 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-scripts\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.015964 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-config-data\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " 
pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.016011 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwsws\" (UniqueName: \"kubernetes.io/projected/2a301731-560b-431e-b265-ef436fa8eccb-kube-api-access-dwsws\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.016030 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a301731-560b-431e-b265-ef436fa8eccb-run-httpd\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.016340 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a301731-560b-431e-b265-ef436fa8eccb-log-httpd\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.016398 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a301731-560b-431e-b265-ef436fa8eccb-run-httpd\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.020593 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-scripts\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.020858 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.021721 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-config-data\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.026828 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.034828 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a301731-560b-431e-b265-ef436fa8eccb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.037507 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwsws\" (UniqueName: \"kubernetes.io/projected/2a301731-560b-431e-b265-ef436fa8eccb-kube-api-access-dwsws\") pod \"ceilometer-0\" (UID: \"2a301731-560b-431e-b265-ef436fa8eccb\") " pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.094499 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.449245 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b4f753-8942-4db3-960b-09f2316e8152" path="/var/lib/kubelet/pods/35b4f753-8942-4db3-960b-09f2316e8152/volumes" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.450698 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90feda1c-209e-474d-bd1c-eee343b5f674" path="/var/lib/kubelet/pods/90feda1c-209e-474d-bd1c-eee343b5f674/volumes" Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.550566 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 14:23:01 crc kubenswrapper[4676]: W0930 14:23:01.559015 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a301731_560b_431e_b265_ef436fa8eccb.slice/crio-dfe36e208735831e58043e95d45359916cc12d36d77f628580bda606b004fac7 WatchSource:0}: Error finding container dfe36e208735831e58043e95d45359916cc12d36d77f628580bda606b004fac7: Status 404 returned error can't find the container with id dfe36e208735831e58043e95d45359916cc12d36d77f628580bda606b004fac7 Sep 30 14:23:01 crc kubenswrapper[4676]: I0930 14:23:01.701933 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a301731-560b-431e-b265-ef436fa8eccb","Type":"ContainerStarted","Data":"dfe36e208735831e58043e95d45359916cc12d36d77f628580bda606b004fac7"} Sep 30 14:23:02 crc kubenswrapper[4676]: I0930 14:23:02.714293 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a301731-560b-431e-b265-ef436fa8eccb","Type":"ContainerStarted","Data":"bae1ef9613219b66a6148b337553837196f05428d47a6364096a4907d82246f6"} Sep 30 14:23:03 crc kubenswrapper[4676]: I0930 14:23:03.724778 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2a301731-560b-431e-b265-ef436fa8eccb","Type":"ContainerStarted","Data":"3d852216af31304aae24049746635a0e08f13455e3b16943895e92e340fc34dc"} Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.248162 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c5q86"] Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.250334 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.283931 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5q86"] Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.397905 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-utilities\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.398251 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25898\" (UniqueName: \"kubernetes.io/projected/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-kube-api-access-25898\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.398470 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-catalog-content\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.500517 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-catalog-content\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.500652 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-utilities\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.500770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25898\" (UniqueName: \"kubernetes.io/projected/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-kube-api-access-25898\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.501097 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-catalog-content\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.501229 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-utilities\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.528748 4676 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-25898\" (UniqueName: \"kubernetes.io/projected/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-kube-api-access-25898\") pod \"redhat-marketplace-c5q86\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.584566 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.756653 4676 generic.go:334] "Generic (PLEG): container finished" podID="e77accdf-897d-4abd-b12c-b28bf6406a78" containerID="61489785c3d00ad3a9ec3beb08cedeeb572595f3b5c955cdb582f80269cd7247" exitCode=0 Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.757335 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4fvlz" event={"ID":"e77accdf-897d-4abd-b12c-b28bf6406a78","Type":"ContainerDied","Data":"61489785c3d00ad3a9ec3beb08cedeeb572595f3b5c955cdb582f80269cd7247"} Sep 30 14:23:04 crc kubenswrapper[4676]: I0930 14:23:04.763755 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a301731-560b-431e-b265-ef436fa8eccb","Type":"ContainerStarted","Data":"506ab65f09d6c16b965561e5c29928dfa933ac23c4a82ae0c9c5a168ef40d99b"} Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.151021 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5q86"] Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.788523 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a301731-560b-431e-b265-ef436fa8eccb","Type":"ContainerStarted","Data":"99e6f32c131a893d2cd0427f21de56b373ed84a713f5d0f2aaa62927967ca1ab"} Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.789870 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 14:23:05 crc 
kubenswrapper[4676]: I0930 14:23:05.794980 4676 generic.go:334] "Generic (PLEG): container finished" podID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerID="242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a" exitCode=0 Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.795456 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5q86" event={"ID":"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0","Type":"ContainerDied","Data":"242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a"} Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.795532 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5q86" event={"ID":"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0","Type":"ContainerStarted","Data":"39b4daf02daa16aafef8313af7c1ead9dc5bef2dc7c9ab06c2a6acbef76c82d8"} Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.827898 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.494569989 podStartE2EDuration="5.827857224s" podCreationTimestamp="2025-09-30 14:23:00 +0000 UTC" firstStartedPulling="2025-09-30 14:23:01.561450218 +0000 UTC m=+1485.544538647" lastFinishedPulling="2025-09-30 14:23:04.894737453 +0000 UTC m=+1488.877825882" observedRunningTime="2025-09-30 14:23:05.820279276 +0000 UTC m=+1489.803367705" watchObservedRunningTime="2025-09-30 14:23:05.827857224 +0000 UTC m=+1489.810945653" Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.996534 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:23:05 crc kubenswrapper[4676]: I0930 14:23:05.997415 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.218606 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.362602 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-546w8\" (UniqueName: \"kubernetes.io/projected/e77accdf-897d-4abd-b12c-b28bf6406a78-kube-api-access-546w8\") pod \"e77accdf-897d-4abd-b12c-b28bf6406a78\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.362756 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle\") pod \"e77accdf-897d-4abd-b12c-b28bf6406a78\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.362938 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-config-data\") pod \"e77accdf-897d-4abd-b12c-b28bf6406a78\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.365997 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-scripts\") pod \"e77accdf-897d-4abd-b12c-b28bf6406a78\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.371895 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77accdf-897d-4abd-b12c-b28bf6406a78-kube-api-access-546w8" (OuterVolumeSpecName: "kube-api-access-546w8") pod "e77accdf-897d-4abd-b12c-b28bf6406a78" (UID: "e77accdf-897d-4abd-b12c-b28bf6406a78"). InnerVolumeSpecName "kube-api-access-546w8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.372647 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-scripts" (OuterVolumeSpecName: "scripts") pod "e77accdf-897d-4abd-b12c-b28bf6406a78" (UID: "e77accdf-897d-4abd-b12c-b28bf6406a78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:06 crc kubenswrapper[4676]: E0930 14:23:06.394681 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle podName:e77accdf-897d-4abd-b12c-b28bf6406a78 nodeName:}" failed. No retries permitted until 2025-09-30 14:23:06.894642117 +0000 UTC m=+1490.877730546 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle") pod "e77accdf-897d-4abd-b12c-b28bf6406a78" (UID: "e77accdf-897d-4abd-b12c-b28bf6406a78") : error deleting /var/lib/kubelet/pods/e77accdf-897d-4abd-b12c-b28bf6406a78/volume-subpaths: remove /var/lib/kubelet/pods/e77accdf-897d-4abd-b12c-b28bf6406a78/volume-subpaths: no such file or directory Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.397507 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-config-data" (OuterVolumeSpecName: "config-data") pod "e77accdf-897d-4abd-b12c-b28bf6406a78" (UID: "e77accdf-897d-4abd-b12c-b28bf6406a78"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.468637 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.468798 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-546w8\" (UniqueName: \"kubernetes.io/projected/e77accdf-897d-4abd-b12c-b28bf6406a78-kube-api-access-546w8\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.468906 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.823633 4676 generic.go:334] "Generic (PLEG): container finished" podID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerID="1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28" exitCode=0 Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.823753 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5q86" event={"ID":"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0","Type":"ContainerDied","Data":"1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28"} Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.829283 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4fvlz" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.829467 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4fvlz" event={"ID":"e77accdf-897d-4abd-b12c-b28bf6406a78","Type":"ContainerDied","Data":"99a68e1f663edff3c865cebf6413ab277b66956ccc8ecfbff2933f6105e2b232"} Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.829559 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99a68e1f663edff3c865cebf6413ab277b66956ccc8ecfbff2933f6105e2b232" Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.979139 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle\") pod \"e77accdf-897d-4abd-b12c-b28bf6406a78\" (UID: \"e77accdf-897d-4abd-b12c-b28bf6406a78\") " Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.981340 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.981661 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2c789c39-fefd-45f4-b5f2-1e85711b5d88" containerName="nova-scheduler-scheduler" containerID="cri-o://c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0" gracePeriod=30 Sep 30 14:23:06 crc kubenswrapper[4676]: I0930 14:23:06.985097 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e77accdf-897d-4abd-b12c-b28bf6406a78" (UID: "e77accdf-897d-4abd-b12c-b28bf6406a78"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.000019 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.015109 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.015255 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.044200 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.044451 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-log" containerID="cri-o://a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6" gracePeriod=30 Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.044599 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-metadata" containerID="cri-o://42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2" gracePeriod=30 Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.081619 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e77accdf-897d-4abd-b12c-b28bf6406a78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.841987 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5q86" event={"ID":"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0","Type":"ContainerStarted","Data":"132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d"} Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.844416 4676 generic.go:334] "Generic (PLEG): container finished" podID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerID="a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6" exitCode=143 Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.844490 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36d8ac97-2979-408c-add8-bbda7b01abe2","Type":"ContainerDied","Data":"a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6"} Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.844682 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-log" containerID="cri-o://a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c" gracePeriod=30 Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.844839 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-api" containerID="cri-o://90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00" gracePeriod=30 Sep 30 14:23:07 crc kubenswrapper[4676]: I0930 14:23:07.867974 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c5q86" podStartSLOduration=2.400238434 podStartE2EDuration="3.867958626s" podCreationTimestamp="2025-09-30 14:23:04 +0000 UTC" 
firstStartedPulling="2025-09-30 14:23:05.797122522 +0000 UTC m=+1489.780210951" lastFinishedPulling="2025-09-30 14:23:07.264842714 +0000 UTC m=+1491.247931143" observedRunningTime="2025-09-30 14:23:07.866220191 +0000 UTC m=+1491.849308620" watchObservedRunningTime="2025-09-30 14:23:07.867958626 +0000 UTC m=+1491.851047055" Sep 30 14:23:08 crc kubenswrapper[4676]: E0930 14:23:08.612354 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 14:23:08 crc kubenswrapper[4676]: E0930 14:23:08.613995 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 14:23:08 crc kubenswrapper[4676]: E0930 14:23:08.615634 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 14:23:08 crc kubenswrapper[4676]: E0930 14:23:08.615706 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2c789c39-fefd-45f4-b5f2-1e85711b5d88" containerName="nova-scheduler-scheduler" Sep 30 14:23:08 crc kubenswrapper[4676]: I0930 14:23:08.857452 4676 generic.go:334] "Generic (PLEG): container 
finished" podID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerID="a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c" exitCode=143 Sep 30 14:23:08 crc kubenswrapper[4676]: I0930 14:23:08.857575 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc3e41b-ae3f-4732-815b-76dda8e0730b","Type":"ContainerDied","Data":"a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c"} Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.190872 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:41000->10.217.0.193:8775: read: connection reset by peer" Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.190906 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:41012->10.217.0.193:8775: read: connection reset by peer" Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.637761 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5ks5l"] Sep 30 14:23:10 crc kubenswrapper[4676]: E0930 14:23:10.638670 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77accdf-897d-4abd-b12c-b28bf6406a78" containerName="nova-manage" Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.638691 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77accdf-897d-4abd-b12c-b28bf6406a78" containerName="nova-manage" Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.638944 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e77accdf-897d-4abd-b12c-b28bf6406a78" containerName="nova-manage" Sep 30 14:23:10 crc kubenswrapper[4676]: 
I0930 14:23:10.640609 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.666724 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5ks5l"]
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.679669 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.756269 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkn2\" (UniqueName: \"kubernetes.io/projected/f5e16da2-d89d-4896-85f9-727c80f00a9b-kube-api-access-hwkn2\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.756488 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-utilities\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.756523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-catalog-content\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.857710 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-nova-metadata-tls-certs\") pod \"36d8ac97-2979-408c-add8-bbda7b01abe2\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") "
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.857918 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8jhn\" (UniqueName: \"kubernetes.io/projected/36d8ac97-2979-408c-add8-bbda7b01abe2-kube-api-access-k8jhn\") pod \"36d8ac97-2979-408c-add8-bbda7b01abe2\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") "
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.857994 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d8ac97-2979-408c-add8-bbda7b01abe2-logs\") pod \"36d8ac97-2979-408c-add8-bbda7b01abe2\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") "
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.858098 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-config-data\") pod \"36d8ac97-2979-408c-add8-bbda7b01abe2\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") "
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.858165 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-combined-ca-bundle\") pod \"36d8ac97-2979-408c-add8-bbda7b01abe2\" (UID: \"36d8ac97-2979-408c-add8-bbda7b01abe2\") "
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.858534 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-utilities\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.858577 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-catalog-content\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.858684 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkn2\" (UniqueName: \"kubernetes.io/projected/f5e16da2-d89d-4896-85f9-727c80f00a9b-kube-api-access-hwkn2\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.859006 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-utilities\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.859023 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d8ac97-2979-408c-add8-bbda7b01abe2-logs" (OuterVolumeSpecName: "logs") pod "36d8ac97-2979-408c-add8-bbda7b01abe2" (UID: "36d8ac97-2979-408c-add8-bbda7b01abe2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.859247 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-catalog-content\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.864344 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d8ac97-2979-408c-add8-bbda7b01abe2-kube-api-access-k8jhn" (OuterVolumeSpecName: "kube-api-access-k8jhn") pod "36d8ac97-2979-408c-add8-bbda7b01abe2" (UID: "36d8ac97-2979-408c-add8-bbda7b01abe2"). InnerVolumeSpecName "kube-api-access-k8jhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.879389 4676 generic.go:334] "Generic (PLEG): container finished" podID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerID="42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2" exitCode=0
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.879440 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.879948 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkn2\" (UniqueName: \"kubernetes.io/projected/f5e16da2-d89d-4896-85f9-727c80f00a9b-kube-api-access-hwkn2\") pod \"community-operators-5ks5l\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.879460 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36d8ac97-2979-408c-add8-bbda7b01abe2","Type":"ContainerDied","Data":"42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2"}
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.880127 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"36d8ac97-2979-408c-add8-bbda7b01abe2","Type":"ContainerDied","Data":"a2d859741ac220cb8e732ae37d5c62693400855ad8279fe9234915301d2535d9"}
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.880208 4676 scope.go:117] "RemoveContainer" containerID="42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.900324 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-config-data" (OuterVolumeSpecName: "config-data") pod "36d8ac97-2979-408c-add8-bbda7b01abe2" (UID: "36d8ac97-2979-408c-add8-bbda7b01abe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.904629 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36d8ac97-2979-408c-add8-bbda7b01abe2" (UID: "36d8ac97-2979-408c-add8-bbda7b01abe2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.931777 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "36d8ac97-2979-408c-add8-bbda7b01abe2" (UID: "36d8ac97-2979-408c-add8-bbda7b01abe2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.960389 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.960423 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.960434 4676 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/36d8ac97-2979-408c-add8-bbda7b01abe2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.960443 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8jhn\" (UniqueName: \"kubernetes.io/projected/36d8ac97-2979-408c-add8-bbda7b01abe2-kube-api-access-k8jhn\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.960454 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d8ac97-2979-408c-add8-bbda7b01abe2-logs\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.983763 4676 scope.go:117] "RemoveContainer" containerID="a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6"
Sep 30 14:23:10 crc kubenswrapper[4676]: I0930 14:23:10.996385 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5ks5l"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.013988 4676 scope.go:117] "RemoveContainer" containerID="42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2"
Sep 30 14:23:11 crc kubenswrapper[4676]: E0930 14:23:11.015147 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2\": container with ID starting with 42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2 not found: ID does not exist" containerID="42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.015193 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2"} err="failed to get container status \"42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2\": rpc error: code = NotFound desc = could not find container \"42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2\": container with ID starting with 42e55a79f6f42cc0e144da5c8e0ea9b45d7d73d68fa705e299fd82c634c2dcc2 not found: ID does not exist"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.015220 4676 scope.go:117] "RemoveContainer" containerID="a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6"
Sep 30 14:23:11 crc kubenswrapper[4676]: E0930 14:23:11.015827 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6\": container with ID starting with a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6 not found: ID does not exist" containerID="a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.015855 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6"} err="failed to get container status \"a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6\": rpc error: code = NotFound desc = could not find container \"a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6\": container with ID starting with a0b9aa2d62eab5874cba513c04d560a51ae84688cd11d0f202bcfd7dd8b21fe6 not found: ID does not exist"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.290640 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.325988 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.344802 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 14:23:11 crc kubenswrapper[4676]: E0930 14:23:11.346057 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-log"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.346089 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-log"
Sep 30 14:23:11 crc kubenswrapper[4676]: E0930 14:23:11.346141 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-metadata"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.346150 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-metadata"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.346604 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-log"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.346634 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" containerName="nova-metadata-metadata"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.350264 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.353565 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.354056 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.392306 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.444218 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d8ac97-2979-408c-add8-bbda7b01abe2" path="/var/lib/kubelet/pods/36d8ac97-2979-408c-add8-bbda7b01abe2/volumes"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.471510 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9578818-8dfa-4aec-8923-d1d9424068be-logs\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.471664 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.472195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6tg\" (UniqueName: \"kubernetes.io/projected/b9578818-8dfa-4aec-8923-d1d9424068be-kube-api-access-zv6tg\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.472260 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-config-data\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.472328 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.573004 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5ks5l"]
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.575081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6tg\" (UniqueName: \"kubernetes.io/projected/b9578818-8dfa-4aec-8923-d1d9424068be-kube-api-access-zv6tg\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.575117 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-config-data\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.575147 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.575197 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9578818-8dfa-4aec-8923-d1d9424068be-logs\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.575263 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.575783 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9578818-8dfa-4aec-8923-d1d9424068be-logs\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.582893 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.583100 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-config-data\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.583537 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9578818-8dfa-4aec-8923-d1d9424068be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.593735 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6tg\" (UniqueName: \"kubernetes.io/projected/b9578818-8dfa-4aec-8923-d1d9424068be-kube-api-access-zv6tg\") pod \"nova-metadata-0\" (UID: \"b9578818-8dfa-4aec-8923-d1d9424068be\") " pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.695424 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 14:23:11 crc kubenswrapper[4676]: I0930 14:23:11.897181 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ks5l" event={"ID":"f5e16da2-d89d-4896-85f9-727c80f00a9b","Type":"ContainerStarted","Data":"ec9114d35d4f400dc88f3d27bcc84e4fc640d60efe249a919bd0f8211a9f8d38"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.187808 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 14:23:12 crc kubenswrapper[4676]: W0930 14:23:12.189686 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9578818_8dfa_4aec_8923_d1d9424068be.slice/crio-20c89e4adabdf839b7065c1f904e29b3825af94a3a7029c5a548d19fbcffec54 WatchSource:0}: Error finding container 20c89e4adabdf839b7065c1f904e29b3825af94a3a7029c5a548d19fbcffec54: Status 404 returned error can't find the container with id 20c89e4adabdf839b7065c1f904e29b3825af94a3a7029c5a548d19fbcffec54
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.735735 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.901627 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-config-data\") pod \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") "
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.902843 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc3e41b-ae3f-4732-815b-76dda8e0730b-logs\") pod \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") "
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.903071 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwrk\" (UniqueName: \"kubernetes.io/projected/0bc3e41b-ae3f-4732-815b-76dda8e0730b-kube-api-access-nzwrk\") pod \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") "
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.903301 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-public-tls-certs\") pod \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") "
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.903421 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-combined-ca-bundle\") pod \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") "
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.903549 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bc3e41b-ae3f-4732-815b-76dda8e0730b-logs" (OuterVolumeSpecName: "logs") pod "0bc3e41b-ae3f-4732-815b-76dda8e0730b" (UID: "0bc3e41b-ae3f-4732-815b-76dda8e0730b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.903759 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-internal-tls-certs\") pod \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\" (UID: \"0bc3e41b-ae3f-4732-815b-76dda8e0730b\") "
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.905220 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bc3e41b-ae3f-4732-815b-76dda8e0730b-logs\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.910632 4676 generic.go:334] "Generic (PLEG): container finished" podID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerID="90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00" exitCode=0
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.910695 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc3e41b-ae3f-4732-815b-76dda8e0730b","Type":"ContainerDied","Data":"90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.910720 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bc3e41b-ae3f-4732-815b-76dda8e0730b","Type":"ContainerDied","Data":"cbf5cb278a884cd4e569fdd7e08f4df75aebfd7b5ea4417a56e0c30391a3db4f"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.910737 4676 scope.go:117] "RemoveContainer" containerID="90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.910899 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.912390 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.913095 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc3e41b-ae3f-4732-815b-76dda8e0730b-kube-api-access-nzwrk" (OuterVolumeSpecName: "kube-api-access-nzwrk") pod "0bc3e41b-ae3f-4732-815b-76dda8e0730b" (UID: "0bc3e41b-ae3f-4732-815b-76dda8e0730b"). InnerVolumeSpecName "kube-api-access-nzwrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.915357 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9578818-8dfa-4aec-8923-d1d9424068be","Type":"ContainerStarted","Data":"a687e5ae2d49dcf65cf40af5418b62fe661fbc98dd88238fef496caf00967129"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.915405 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9578818-8dfa-4aec-8923-d1d9424068be","Type":"ContainerStarted","Data":"da58c22a9f2dcc7205bbc789becbe06a280887191cc06ee3fe587f37183772c3"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.915422 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9578818-8dfa-4aec-8923-d1d9424068be","Type":"ContainerStarted","Data":"20c89e4adabdf839b7065c1f904e29b3825af94a3a7029c5a548d19fbcffec54"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.921409 4676 generic.go:334] "Generic (PLEG): container finished" podID="2c789c39-fefd-45f4-b5f2-1e85711b5d88" containerID="c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0" exitCode=0
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.921453 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.921511 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c789c39-fefd-45f4-b5f2-1e85711b5d88","Type":"ContainerDied","Data":"c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.921553 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c789c39-fefd-45f4-b5f2-1e85711b5d88","Type":"ContainerDied","Data":"1873c4369e07d3b9b6f9d00e2c85fb66b0a527855abb33e55106cb1deb811a00"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.932507 4676 generic.go:334] "Generic (PLEG): container finished" podID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerID="f5ca0365b3c5c12d4896860c008e693871d812adb6bb057a94a362affcc71020" exitCode=0
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.933199 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ks5l" event={"ID":"f5e16da2-d89d-4896-85f9-727c80f00a9b","Type":"ContainerDied","Data":"f5ca0365b3c5c12d4896860c008e693871d812adb6bb057a94a362affcc71020"}
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.936084 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bc3e41b-ae3f-4732-815b-76dda8e0730b" (UID: "0bc3e41b-ae3f-4732-815b-76dda8e0730b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.938191 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-config-data" (OuterVolumeSpecName: "config-data") pod "0bc3e41b-ae3f-4732-815b-76dda8e0730b" (UID: "0bc3e41b-ae3f-4732-815b-76dda8e0730b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.945041 4676 scope.go:117] "RemoveContainer" containerID="a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.963577 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.963536371 podStartE2EDuration="1.963536371s" podCreationTimestamp="2025-09-30 14:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:23:12.957482809 +0000 UTC m=+1496.940571238" watchObservedRunningTime="2025-09-30 14:23:12.963536371 +0000 UTC m=+1496.946624800"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.995315 4676 scope.go:117] "RemoveContainer" containerID="90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00"
Sep 30 14:23:12 crc kubenswrapper[4676]: E0930 14:23:12.995737 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00\": container with ID starting with 90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00 not found: ID does not exist" containerID="90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.995738 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0bc3e41b-ae3f-4732-815b-76dda8e0730b" (UID: "0bc3e41b-ae3f-4732-815b-76dda8e0730b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.995764 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00"} err="failed to get container status \"90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00\": rpc error: code = NotFound desc = could not find container \"90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00\": container with ID starting with 90419b61c388d39e6ad278ec514bdc43c8f2b099c10f431cc4df4aaa98351d00 not found: ID does not exist"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.995784 4676 scope.go:117] "RemoveContainer" containerID="a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c"
Sep 30 14:23:12 crc kubenswrapper[4676]: E0930 14:23:12.996244 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c\": container with ID starting with a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c not found: ID does not exist" containerID="a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.996298 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c"} err="failed to get container status \"a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c\": rpc error: code = NotFound desc = could not find container \"a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c\": container with ID starting with a5fb5f902c326942c29fc3001dd58ce4d84bd50e8486e168bd7ff43abf0ce87c not found: ID does not exist"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.996333 4676 scope.go:117] "RemoveContainer" containerID="c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0"
Sep 30 14:23:12 crc kubenswrapper[4676]: I0930 14:23:12.999347 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0bc3e41b-ae3f-4732-815b-76dda8e0730b" (UID: "0bc3e41b-ae3f-4732-815b-76dda8e0730b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.009784 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.009977 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwrk\" (UniqueName: \"kubernetes.io/projected/0bc3e41b-ae3f-4732-815b-76dda8e0730b-kube-api-access-nzwrk\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.010010 4676 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.010021 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.010030 4676 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bc3e41b-ae3f-4732-815b-76dda8e0730b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.018967 4676 scope.go:117] "RemoveContainer" containerID="c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0"
Sep 30 14:23:13 crc kubenswrapper[4676]: E0930 14:23:13.019460 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0\": container with ID starting with c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0 not found: ID does not exist" containerID="c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0"
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.019499 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0"} err="failed to get container status \"c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0\": rpc error: code = NotFound desc = could not find container \"c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0\": container with ID starting with c28faf240d8a12e9f2d3fcd25b1b64c2869979e9fa93b3d2c1075499cbd773d0 not found: ID does not exist"
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.111062 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-config-data\") pod \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") "
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.111594 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-combined-ca-bundle\") pod \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") "
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.111726 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfpk4\" (UniqueName: \"kubernetes.io/projected/2c789c39-fefd-45f4-b5f2-1e85711b5d88-kube-api-access-rfpk4\") pod \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\" (UID: \"2c789c39-fefd-45f4-b5f2-1e85711b5d88\") "
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.115929 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c789c39-fefd-45f4-b5f2-1e85711b5d88-kube-api-access-rfpk4" (OuterVolumeSpecName: "kube-api-access-rfpk4") pod "2c789c39-fefd-45f4-b5f2-1e85711b5d88" (UID: "2c789c39-fefd-45f4-b5f2-1e85711b5d88"). InnerVolumeSpecName "kube-api-access-rfpk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.151078 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-config-data" (OuterVolumeSpecName: "config-data") pod "2c789c39-fefd-45f4-b5f2-1e85711b5d88" (UID: "2c789c39-fefd-45f4-b5f2-1e85711b5d88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.158087 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c789c39-fefd-45f4-b5f2-1e85711b5d88" (UID: "2c789c39-fefd-45f4-b5f2-1e85711b5d88"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.215251 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.215293 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfpk4\" (UniqueName: \"kubernetes.io/projected/2c789c39-fefd-45f4-b5f2-1e85711b5d88-kube-api-access-rfpk4\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.215306 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c789c39-fefd-45f4-b5f2-1e85711b5d88-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.284872 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.315008 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.339966 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.385077 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.413967 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: E0930 14:23:13.414388 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c789c39-fefd-45f4-b5f2-1e85711b5d88" containerName="nova-scheduler-scheduler" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.414410 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c789c39-fefd-45f4-b5f2-1e85711b5d88" 
containerName="nova-scheduler-scheduler" Sep 30 14:23:13 crc kubenswrapper[4676]: E0930 14:23:13.414438 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-api" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.414445 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-api" Sep 30 14:23:13 crc kubenswrapper[4676]: E0930 14:23:13.414456 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-log" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.414462 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-log" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.426020 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-api" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.426087 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c789c39-fefd-45f4-b5f2-1e85711b5d88" containerName="nova-scheduler-scheduler" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.426115 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" containerName="nova-api-log" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.429404 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.432990 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.433082 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.433086 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.474132 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc3e41b-ae3f-4732-815b-76dda8e0730b" path="/var/lib/kubelet/pods/0bc3e41b-ae3f-4732-815b-76dda8e0730b/volumes" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.475694 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c789c39-fefd-45f4-b5f2-1e85711b5d88" path="/var/lib/kubelet/pods/2c789c39-fefd-45f4-b5f2-1e85711b5d88/volumes" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.481258 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.481468 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.484440 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.488341 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.492978 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.532827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.532946 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-public-tls-certs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.533097 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-config-data\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.533339 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b49d9f0a-d50a-409b-b985-c09b657e9ba2-logs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.533618 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-9chww\" (UniqueName: \"kubernetes.io/projected/b49d9f0a-d50a-409b-b985-c09b657e9ba2-kube-api-access-9chww\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.533709 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.636198 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk42q\" (UniqueName: \"kubernetes.io/projected/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-kube-api-access-rk42q\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.636523 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-public-tls-certs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.636704 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-config-data\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.636812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-config-data\") pod 
\"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.637037 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b49d9f0a-d50a-409b-b985-c09b657e9ba2-logs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.637447 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9chww\" (UniqueName: \"kubernetes.io/projected/b49d9f0a-d50a-409b-b985-c09b657e9ba2-kube-api-access-9chww\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.637646 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.637720 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.637721 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b49d9f0a-d50a-409b-b985-c09b657e9ba2-logs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.638197 4676 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.640771 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-public-tls-certs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.641648 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-config-data\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.643056 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.643413 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b49d9f0a-d50a-409b-b985-c09b657e9ba2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.657382 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9chww\" (UniqueName: \"kubernetes.io/projected/b49d9f0a-d50a-409b-b985-c09b657e9ba2-kube-api-access-9chww\") pod \"nova-api-0\" (UID: \"b49d9f0a-d50a-409b-b985-c09b657e9ba2\") " pod="openstack/nova-api-0" Sep 30 14:23:13 crc 
kubenswrapper[4676]: I0930 14:23:13.740287 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk42q\" (UniqueName: \"kubernetes.io/projected/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-kube-api-access-rk42q\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.740379 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-config-data\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.740448 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.744250 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-config-data\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.744834 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.751647 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.759813 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk42q\" (UniqueName: \"kubernetes.io/projected/92d9c3ce-1a0b-48d0-a88b-ce25162e54b0-kube-api-access-rk42q\") pod \"nova-scheduler-0\" (UID: \"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0\") " pod="openstack/nova-scheduler-0" Sep 30 14:23:13 crc kubenswrapper[4676]: I0930 14:23:13.805777 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.216113 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 14:23:14 crc kubenswrapper[4676]: W0930 14:23:14.216383 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb49d9f0a_d50a_409b_b985_c09b657e9ba2.slice/crio-03fbf482c86f4fa14501089af8634f5f363f5aea85ba7765cbba6d794e662503 WatchSource:0}: Error finding container 03fbf482c86f4fa14501089af8634f5f363f5aea85ba7765cbba6d794e662503: Status 404 returned error can't find the container with id 03fbf482c86f4fa14501089af8634f5f363f5aea85ba7765cbba6d794e662503 Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.312423 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.584783 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.584854 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.645273 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.970662 4676 generic.go:334] "Generic (PLEG): container finished" podID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerID="c2fe8942b64a932ed3b353cb4dcf782d249e316feec4811ecca955cb274b1862" exitCode=0 Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.970729 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ks5l" event={"ID":"f5e16da2-d89d-4896-85f9-727c80f00a9b","Type":"ContainerDied","Data":"c2fe8942b64a932ed3b353cb4dcf782d249e316feec4811ecca955cb274b1862"} Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.981563 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b49d9f0a-d50a-409b-b985-c09b657e9ba2","Type":"ContainerStarted","Data":"bd24323e746efb789c586ec9ecd227788c962911626cdbb01b365652c9e72b8c"} Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.981607 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b49d9f0a-d50a-409b-b985-c09b657e9ba2","Type":"ContainerStarted","Data":"7a78288c7366538909c298aceea9af5314bbd583934d09b22858e8faaa366025"} Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.981620 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b49d9f0a-d50a-409b-b985-c09b657e9ba2","Type":"ContainerStarted","Data":"03fbf482c86f4fa14501089af8634f5f363f5aea85ba7765cbba6d794e662503"} Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.986213 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0","Type":"ContainerStarted","Data":"f7e6ef3b51a09c5174ac0deb5a2c1d937dff674e28458ae9c81d5baae1da5b0f"} Sep 30 14:23:14 crc kubenswrapper[4676]: I0930 14:23:14.986259 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"92d9c3ce-1a0b-48d0-a88b-ce25162e54b0","Type":"ContainerStarted","Data":"ffc46ed69d0d7a11feb9080851f1cd56465718f79fd7eaf00b7b0740913bd51e"} Sep 30 14:23:15 crc kubenswrapper[4676]: I0930 14:23:15.034692 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.034671957 podStartE2EDuration="2.034671957s" podCreationTimestamp="2025-09-30 14:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:23:15.016737597 +0000 UTC m=+1498.999826036" watchObservedRunningTime="2025-09-30 14:23:15.034671957 +0000 UTC m=+1499.017760386" Sep 30 14:23:15 crc kubenswrapper[4676]: I0930 14:23:15.054841 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.054817903 podStartE2EDuration="2.054817903s" podCreationTimestamp="2025-09-30 14:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:23:15.045563261 +0000 UTC m=+1499.028651710" watchObservedRunningTime="2025-09-30 14:23:15.054817903 +0000 UTC m=+1499.037906332" Sep 30 14:23:15 crc kubenswrapper[4676]: I0930 14:23:15.055497 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:15 crc kubenswrapper[4676]: I0930 14:23:15.997430 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ks5l" event={"ID":"f5e16da2-d89d-4896-85f9-727c80f00a9b","Type":"ContainerStarted","Data":"b1b4523951ab1e8bc1137c241434e6e7763322c9f155a07229684db62157a000"} Sep 30 14:23:16 crc kubenswrapper[4676]: I0930 14:23:16.026862 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5ks5l" podStartSLOduration=3.545096074 
podStartE2EDuration="6.026841011s" podCreationTimestamp="2025-09-30 14:23:10 +0000 UTC" firstStartedPulling="2025-09-30 14:23:12.944753419 +0000 UTC m=+1496.927841848" lastFinishedPulling="2025-09-30 14:23:15.426498356 +0000 UTC m=+1499.409586785" observedRunningTime="2025-09-30 14:23:16.018615204 +0000 UTC m=+1500.001703643" watchObservedRunningTime="2025-09-30 14:23:16.026841011 +0000 UTC m=+1500.009929440" Sep 30 14:23:16 crc kubenswrapper[4676]: I0930 14:23:16.695816 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:23:16 crc kubenswrapper[4676]: I0930 14:23:16.696191 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 14:23:17 crc kubenswrapper[4676]: I0930 14:23:17.419056 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5q86"] Sep 30 14:23:17 crc kubenswrapper[4676]: I0930 14:23:17.419329 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c5q86" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="registry-server" containerID="cri-o://132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d" gracePeriod=2 Sep 30 14:23:17 crc kubenswrapper[4676]: I0930 14:23:17.899912 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.019629 4676 generic.go:334] "Generic (PLEG): container finished" podID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerID="132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d" exitCode=0 Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.019678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5q86" event={"ID":"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0","Type":"ContainerDied","Data":"132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d"} Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.019716 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5q86" event={"ID":"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0","Type":"ContainerDied","Data":"39b4daf02daa16aafef8313af7c1ead9dc5bef2dc7c9ab06c2a6acbef76c82d8"} Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.019736 4676 scope.go:117] "RemoveContainer" containerID="132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.019818 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5q86" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.023781 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-utilities\") pod \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.024106 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25898\" (UniqueName: \"kubernetes.io/projected/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-kube-api-access-25898\") pod \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.024190 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-catalog-content\") pod \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\" (UID: \"a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0\") " Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.024901 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-utilities" (OuterVolumeSpecName: "utilities") pod "a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" (UID: "a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.030940 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-kube-api-access-25898" (OuterVolumeSpecName: "kube-api-access-25898") pod "a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" (UID: "a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0"). InnerVolumeSpecName "kube-api-access-25898". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.046185 4676 scope.go:117] "RemoveContainer" containerID="1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.046563 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" (UID: "a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.107453 4676 scope.go:117] "RemoveContainer" containerID="242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.127367 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25898\" (UniqueName: \"kubernetes.io/projected/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-kube-api-access-25898\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.127415 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.127430 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.151765 4676 scope.go:117] "RemoveContainer" containerID="132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d" Sep 30 14:23:18 crc kubenswrapper[4676]: E0930 14:23:18.152414 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d\": container with ID starting with 132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d not found: ID does not exist" containerID="132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.152478 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d"} err="failed to get container status \"132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d\": rpc error: code = NotFound desc = could not find container \"132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d\": container with ID starting with 132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d not found: ID does not exist" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.152511 4676 scope.go:117] "RemoveContainer" containerID="1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28" Sep 30 14:23:18 crc kubenswrapper[4676]: E0930 14:23:18.153234 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28\": container with ID starting with 1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28 not found: ID does not exist" containerID="1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.153318 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28"} err="failed to get container status \"1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28\": rpc error: code = NotFound desc = could not find container 
\"1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28\": container with ID starting with 1885dc8da6cc2d94469d4ee0a8533a2f9606fe74928f79b16cd7e30dea823d28 not found: ID does not exist" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.153388 4676 scope.go:117] "RemoveContainer" containerID="242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a" Sep 30 14:23:18 crc kubenswrapper[4676]: E0930 14:23:18.153805 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a\": container with ID starting with 242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a not found: ID does not exist" containerID="242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.153842 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a"} err="failed to get container status \"242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a\": rpc error: code = NotFound desc = could not find container \"242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a\": container with ID starting with 242942cd9e20d89b513415c89434ee39f6c9be6e01e8d55361fff901c232996a not found: ID does not exist" Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.356671 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5q86"] Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.365539 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5q86"] Sep 30 14:23:18 crc kubenswrapper[4676]: I0930 14:23:18.806779 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 14:23:19 crc kubenswrapper[4676]: I0930 
14:23:19.444675 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" path="/var/lib/kubelet/pods/a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0/volumes" Sep 30 14:23:20 crc kubenswrapper[4676]: I0930 14:23:20.996661 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5ks5l" Sep 30 14:23:20 crc kubenswrapper[4676]: I0930 14:23:20.997059 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5ks5l" Sep 30 14:23:21 crc kubenswrapper[4676]: I0930 14:23:21.042823 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5ks5l" Sep 30 14:23:21 crc kubenswrapper[4676]: I0930 14:23:21.112531 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5ks5l" Sep 30 14:23:21 crc kubenswrapper[4676]: I0930 14:23:21.696387 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 14:23:21 crc kubenswrapper[4676]: I0930 14:23:21.696457 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 14:23:22 crc kubenswrapper[4676]: I0930 14:23:22.022574 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5ks5l"] Sep 30 14:23:22 crc kubenswrapper[4676]: I0930 14:23:22.708048 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9578818-8dfa-4aec-8923-d1d9424068be" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:23:22 crc kubenswrapper[4676]: I0930 14:23:22.708149 4676 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="b9578818-8dfa-4aec-8923-d1d9424068be" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:23:23 crc kubenswrapper[4676]: I0930 14:23:23.085798 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5ks5l" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="registry-server" containerID="cri-o://b1b4523951ab1e8bc1137c241434e6e7763322c9f155a07229684db62157a000" gracePeriod=2 Sep 30 14:23:23 crc kubenswrapper[4676]: E0930 14:23:23.201987 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c17bb9_4ccb_470c_9f5b_69dd8420b4b0.slice/crio-conmon-132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:23:23 crc kubenswrapper[4676]: I0930 14:23:23.752396 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:23:23 crc kubenswrapper[4676]: I0930 14:23:23.753115 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 14:23:23 crc kubenswrapper[4676]: I0930 14:23:23.806371 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 14:23:23 crc kubenswrapper[4676]: I0930 14:23:23.845854 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.099000 4676 generic.go:334] "Generic (PLEG): container finished" podID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerID="b1b4523951ab1e8bc1137c241434e6e7763322c9f155a07229684db62157a000" exitCode=0 Sep 30 14:23:24 
crc kubenswrapper[4676]: I0930 14:23:24.099088 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ks5l" event={"ID":"f5e16da2-d89d-4896-85f9-727c80f00a9b","Type":"ContainerDied","Data":"b1b4523951ab1e8bc1137c241434e6e7763322c9f155a07229684db62157a000"} Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.099126 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ks5l" event={"ID":"f5e16da2-d89d-4896-85f9-727c80f00a9b","Type":"ContainerDied","Data":"ec9114d35d4f400dc88f3d27bcc84e4fc640d60efe249a919bd0f8211a9f8d38"} Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.099137 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec9114d35d4f400dc88f3d27bcc84e4fc640d60efe249a919bd0f8211a9f8d38" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.127697 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.141952 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5ks5l" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.242865 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkn2\" (UniqueName: \"kubernetes.io/projected/f5e16da2-d89d-4896-85f9-727c80f00a9b-kube-api-access-hwkn2\") pod \"f5e16da2-d89d-4896-85f9-727c80f00a9b\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.242973 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-catalog-content\") pod \"f5e16da2-d89d-4896-85f9-727c80f00a9b\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.243184 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-utilities\") pod \"f5e16da2-d89d-4896-85f9-727c80f00a9b\" (UID: \"f5e16da2-d89d-4896-85f9-727c80f00a9b\") " Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.243903 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-utilities" (OuterVolumeSpecName: "utilities") pod "f5e16da2-d89d-4896-85f9-727c80f00a9b" (UID: "f5e16da2-d89d-4896-85f9-727c80f00a9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.248521 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e16da2-d89d-4896-85f9-727c80f00a9b-kube-api-access-hwkn2" (OuterVolumeSpecName: "kube-api-access-hwkn2") pod "f5e16da2-d89d-4896-85f9-727c80f00a9b" (UID: "f5e16da2-d89d-4896-85f9-727c80f00a9b"). InnerVolumeSpecName "kube-api-access-hwkn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.296715 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5e16da2-d89d-4896-85f9-727c80f00a9b" (UID: "f5e16da2-d89d-4896-85f9-727c80f00a9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.345069 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.345110 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwkn2\" (UniqueName: \"kubernetes.io/projected/f5e16da2-d89d-4896-85f9-727c80f00a9b-kube-api-access-hwkn2\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.345122 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5e16da2-d89d-4896-85f9-727c80f00a9b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.798074 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b49d9f0a-d50a-409b-b985-c09b657e9ba2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 14:23:24 crc kubenswrapper[4676]: I0930 14:23:24.798082 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b49d9f0a-d50a-409b-b985-c09b657e9ba2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Sep 30 14:23:25 crc kubenswrapper[4676]: I0930 14:23:25.115626 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5ks5l" Sep 30 14:23:25 crc kubenswrapper[4676]: I0930 14:23:25.154321 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5ks5l"] Sep 30 14:23:25 crc kubenswrapper[4676]: I0930 14:23:25.162612 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5ks5l"] Sep 30 14:23:25 crc kubenswrapper[4676]: I0930 14:23:25.442636 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" path="/var/lib/kubelet/pods/f5e16da2-d89d-4896-85f9-727c80f00a9b/volumes" Sep 30 14:23:29 crc kubenswrapper[4676]: I0930 14:23:29.919633 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:23:29 crc kubenswrapper[4676]: I0930 14:23:29.920979 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:23:31 crc kubenswrapper[4676]: I0930 14:23:31.108844 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 14:23:31 crc kubenswrapper[4676]: I0930 14:23:31.701356 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 14:23:31 crc kubenswrapper[4676]: I0930 14:23:31.706669 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 14:23:31 crc kubenswrapper[4676]: I0930 14:23:31.708791 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 14:23:32 crc kubenswrapper[4676]: I0930 14:23:32.197325 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 14:23:33 crc kubenswrapper[4676]: E0930 14:23:33.470769 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c17bb9_4ccb_470c_9f5b_69dd8420b4b0.slice/crio-conmon-132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:23:33 crc kubenswrapper[4676]: I0930 14:23:33.758290 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:23:33 crc kubenswrapper[4676]: I0930 14:23:33.758446 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 14:23:33 crc kubenswrapper[4676]: I0930 14:23:33.758817 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 14:23:33 crc kubenswrapper[4676]: I0930 14:23:33.758838 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 14:23:33 crc kubenswrapper[4676]: I0930 14:23:33.770272 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:23:33 crc kubenswrapper[4676]: I0930 14:23:33.770461 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 14:23:42 crc kubenswrapper[4676]: I0930 14:23:42.104726 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 
14:23:43 crc kubenswrapper[4676]: I0930 14:23:43.457795 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:23:43 crc kubenswrapper[4676]: E0930 14:23:43.748955 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c17bb9_4ccb_470c_9f5b_69dd8420b4b0.slice/crio-conmon-132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:23:47 crc kubenswrapper[4676]: I0930 14:23:47.247405 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerName="rabbitmq" containerID="cri-o://f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd" gracePeriod=604795 Sep 30 14:23:47 crc kubenswrapper[4676]: I0930 14:23:47.704584 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerName="rabbitmq" containerID="cri-o://bb59ac140098029e345d6da7202977df3632e37bef8af0d303536be23487701d" gracePeriod=604796 Sep 30 14:23:53 crc kubenswrapper[4676]: I0930 14:23:53.876941 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036305 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036410 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9fc6\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-kube-api-access-q9fc6\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036463 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-server-conf\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036491 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-tls\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036523 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-confd\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036556 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/0ee88379-95a2-4019-a41a-7931a5ab2f30-pod-info\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036652 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-plugins-conf\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036685 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-erlang-cookie\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036744 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-config-data\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036799 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-plugins\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.036939 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee88379-95a2-4019-a41a-7931a5ab2f30-erlang-cookie-secret\") pod \"0ee88379-95a2-4019-a41a-7931a5ab2f30\" (UID: \"0ee88379-95a2-4019-a41a-7931a5ab2f30\") " Sep 30 14:23:54 crc 
kubenswrapper[4676]: I0930 14:23:54.037352 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.037819 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.038072 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.038184 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.042928 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-kube-api-access-q9fc6" (OuterVolumeSpecName: "kube-api-access-q9fc6") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). 
InnerVolumeSpecName "kube-api-access-q9fc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.043045 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee88379-95a2-4019-a41a-7931a5ab2f30-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.043742 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.057072 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0ee88379-95a2-4019-a41a-7931a5ab2f30-pod-info" (OuterVolumeSpecName: "pod-info") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.057271 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.072025 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-config-data" (OuterVolumeSpecName: "config-data") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.094550 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65fc51e6_2db0_4efd_880c_0ba599ef8637.slice/crio-bb59ac140098029e345d6da7202977df3632e37bef8af0d303536be23487701d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c17bb9_4ccb_470c_9f5b_69dd8420b4b0.slice/crio-conmon-132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.108210 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-server-conf" (OuterVolumeSpecName: "server-conf") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139283 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139677 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139688 4676 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee88379-95a2-4019-a41a-7931a5ab2f30-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139716 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139728 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9fc6\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-kube-api-access-q9fc6\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139736 4676 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139744 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139752 4676 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee88379-95a2-4019-a41a-7931a5ab2f30-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.139759 4676 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee88379-95a2-4019-a41a-7931a5ab2f30-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.165040 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.188508 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0ee88379-95a2-4019-a41a-7931a5ab2f30" (UID: "0ee88379-95a2-4019-a41a-7931a5ab2f30"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.241248 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.241285 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee88379-95a2-4019-a41a-7931a5ab2f30-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.405378 4676 generic.go:334] "Generic (PLEG): container finished" podID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerID="f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd" exitCode=0 Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.405448 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ee88379-95a2-4019-a41a-7931a5ab2f30","Type":"ContainerDied","Data":"f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd"} Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.405480 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0ee88379-95a2-4019-a41a-7931a5ab2f30","Type":"ContainerDied","Data":"fcea88a4ef537edc4f1a5808b5837900199b338d0357a20388b5e52d683a133e"} Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.405500 4676 scope.go:117] "RemoveContainer" containerID="f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.405645 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.426434 4676 generic.go:334] "Generic (PLEG): container finished" podID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerID="bb59ac140098029e345d6da7202977df3632e37bef8af0d303536be23487701d" exitCode=0 Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.426481 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65fc51e6-2db0-4efd-880c-0ba599ef8637","Type":"ContainerDied","Data":"bb59ac140098029e345d6da7202977df3632e37bef8af0d303536be23487701d"} Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.490052 4676 scope.go:117] "RemoveContainer" containerID="6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.490222 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.510678 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.517951 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.550942 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551374 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="extract-content" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551391 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="extract-content" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551409 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerName="setup-container" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551415 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerName="setup-container" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551430 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="registry-server" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551436 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="registry-server" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551449 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="extract-content" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551454 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="extract-content" Sep 30 14:23:54 crc kubenswrapper[4676]: 
E0930 14:23:54.551464 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerName="rabbitmq" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551470 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerName="rabbitmq" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551481 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="extract-utilities" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551487 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="extract-utilities" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551498 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="registry-server" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551504 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="registry-server" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551521 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerName="setup-container" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551526 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerName="setup-container" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551536 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerName="rabbitmq" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551543 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerName="rabbitmq" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.551558 4676 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="extract-utilities" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551563 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="extract-utilities" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551722 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fc51e6-2db0-4efd-880c-0ba599ef8637" containerName="rabbitmq" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551737 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5e16da2-d89d-4896-85f9-727c80f00a9b" containerName="registry-server" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551753 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c17bb9-4ccb-470c-9f5b-69dd8420b4b0" containerName="registry-server" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.551764 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee88379-95a2-4019-a41a-7931a5ab2f30" containerName="rabbitmq" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.552699 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.562741 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.563037 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.563256 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.563571 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lps8l" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.563761 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.577951 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.578186 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.611169 4676 scope.go:117] "RemoveContainer" containerID="f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.611724 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd\": container with ID starting with f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd not found: ID does not exist" containerID="f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.611761 4676 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd"} err="failed to get container status \"f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd\": rpc error: code = NotFound desc = could not find container \"f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd\": container with ID starting with f061333676831439f5d651eeb48da1b574912c769ed66d6ad08be9c6767e44fd not found: ID does not exist" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.611785 4676 scope.go:117] "RemoveContainer" containerID="6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522" Sep 30 14:23:54 crc kubenswrapper[4676]: E0930 14:23:54.612325 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522\": container with ID starting with 6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522 not found: ID does not exist" containerID="6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.612351 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522"} err="failed to get container status \"6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522\": rpc error: code = NotFound desc = could not find container \"6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522\": container with ID starting with 6c8ec10ab315c39eb7ff01d8e84e99116af149f3216722235ba385c2ec21e522 not found: ID does not exist" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.624823 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.665580 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-plugins\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.665629 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65fc51e6-2db0-4efd-880c-0ba599ef8637-erlang-cookie-secret\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.665654 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-config-data\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.667853 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-server-conf\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.667932 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-confd\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.667973 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-tls\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 
30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668016 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-erlang-cookie\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668062 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668099 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65fc51e6-2db0-4efd-880c-0ba599ef8637-pod-info\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668210 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz85l\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-kube-api-access-xz85l\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668246 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-plugins-conf\") pod \"65fc51e6-2db0-4efd-880c-0ba599ef8637\" (UID: \"65fc51e6-2db0-4efd-880c-0ba599ef8637\") " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668411 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-plugins" (OuterVolumeSpecName: 
"rabbitmq-plugins") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668567 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668618 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/140577c7-99f4-4dc1-85dd-1bec990df549-pod-info\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668788 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668824 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-config-data\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668911 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-server-conf\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.668974 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.669032 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pczl7\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-kube-api-access-pczl7\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.669142 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.669189 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/140577c7-99f4-4dc1-85dd-1bec990df549-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.669266 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.669318 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.669447 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.673241 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.674606 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/65fc51e6-2db0-4efd-880c-0ba599ef8637-pod-info" (OuterVolumeSpecName: "pod-info") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.674643 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.677513 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.677497 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65fc51e6-2db0-4efd-880c-0ba599ef8637-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.683120 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-kube-api-access-xz85l" (OuterVolumeSpecName: "kube-api-access-xz85l") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "kube-api-access-xz85l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.683345 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.707300 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-config-data" (OuterVolumeSpecName: "config-data") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.762095 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-server-conf" (OuterVolumeSpecName: "server-conf") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771329 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/140577c7-99f4-4dc1-85dd-1bec990df549-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771409 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771458 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771513 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771540 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/140577c7-99f4-4dc1-85dd-1bec990df549-pod-info\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771580 4676 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771602 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-config-data\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771635 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-server-conf\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771673 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771705 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pczl7\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-kube-api-access-pczl7\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771759 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771843 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771857 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771897 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771909 4676 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65fc51e6-2db0-4efd-880c-0ba599ef8637-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771921 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz85l\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-kube-api-access-xz85l\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771932 4676 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771942 4676 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65fc51e6-2db0-4efd-880c-0ba599ef8637-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 
14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771952 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.771962 4676 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65fc51e6-2db0-4efd-880c-0ba599ef8637-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.774016 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.775441 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-config-data\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.775511 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.775690 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 
14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.776153 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-server-conf\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.776615 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/140577c7-99f4-4dc1-85dd-1bec990df549-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.777034 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/140577c7-99f4-4dc1-85dd-1bec990df549-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.781976 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.786819 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.788371 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/140577c7-99f4-4dc1-85dd-1bec990df549-pod-info\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.792731 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pczl7\" (UniqueName: \"kubernetes.io/projected/140577c7-99f4-4dc1-85dd-1bec990df549-kube-api-access-pczl7\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.806287 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.827712 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "65fc51e6-2db0-4efd-880c-0ba599ef8637" (UID: "65fc51e6-2db0-4efd-880c-0ba599ef8637"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.832463 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"140577c7-99f4-4dc1-85dd-1bec990df549\") " pod="openstack/rabbitmq-server-0" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.873648 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65fc51e6-2db0-4efd-880c-0ba599ef8637-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.873684 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:23:54 crc kubenswrapper[4676]: I0930 14:23:54.927663 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.399549 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.444699 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.450032 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee88379-95a2-4019-a41a-7931a5ab2f30" path="/var/lib/kubelet/pods/0ee88379-95a2-4019-a41a-7931a5ab2f30/volumes" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.451219 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"65fc51e6-2db0-4efd-880c-0ba599ef8637","Type":"ContainerDied","Data":"e56d13fc7d0baeaa2fcbfcecd2480ee9e2cfc3d4f710aabec23ca62f8990a525"} Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.451270 4676 scope.go:117] "RemoveContainer" containerID="bb59ac140098029e345d6da7202977df3632e37bef8af0d303536be23487701d" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.464847 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"140577c7-99f4-4dc1-85dd-1bec990df549","Type":"ContainerStarted","Data":"811f2408c78fc97d733de12effa31bb9efbe652d1d378353c1467695c19567ee"} Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.490794 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.507939 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.522696 4676 scope.go:117] "RemoveContainer" containerID="00862cfe13d270a978c29c8f3cf52989a207952fa61191fd0dfcc057ec6241f6" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.530094 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.532277 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.536404 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.536661 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.536980 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.537178 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mwbdr" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.537369 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.536989 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.537642 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.564853 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586124 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586196 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pbdrj\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-kube-api-access-pbdrj\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586246 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586272 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586324 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586370 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56d86f95-7d72-42d4-84d7-fdca29b1270f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586407 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586435 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56d86f95-7d72-42d4-84d7-fdca29b1270f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586461 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586492 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.586510 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688025 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688427 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56d86f95-7d72-42d4-84d7-fdca29b1270f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688461 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688488 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56d86f95-7d72-42d4-84d7-fdca29b1270f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688515 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688556 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688618 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688647 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdrj\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-kube-api-access-pbdrj\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688680 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.688699 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.689127 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.689291 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.689510 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.689601 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.689636 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.689656 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56d86f95-7d72-42d4-84d7-fdca29b1270f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.694208 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56d86f95-7d72-42d4-84d7-fdca29b1270f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.694269 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56d86f95-7d72-42d4-84d7-fdca29b1270f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.694627 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.700543 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.708040 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdrj\" (UniqueName: 
\"kubernetes.io/projected/56d86f95-7d72-42d4-84d7-fdca29b1270f-kube-api-access-pbdrj\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.728463 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56d86f95-7d72-42d4-84d7-fdca29b1270f\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:55 crc kubenswrapper[4676]: I0930 14:23:55.973727 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:23:56 crc kubenswrapper[4676]: I0930 14:23:56.473942 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 14:23:56 crc kubenswrapper[4676]: I0930 14:23:56.475649 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"140577c7-99f4-4dc1-85dd-1bec990df549","Type":"ContainerStarted","Data":"a27c6e0ad828c012ec9442f693ba51b22f10ee952758c3ec547473cb6dae45ae"} Sep 30 14:23:57 crc kubenswrapper[4676]: I0930 14:23:57.443574 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65fc51e6-2db0-4efd-880c-0ba599ef8637" path="/var/lib/kubelet/pods/65fc51e6-2db0-4efd-880c-0ba599ef8637/volumes" Sep 30 14:23:57 crc kubenswrapper[4676]: I0930 14:23:57.495313 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56d86f95-7d72-42d4-84d7-fdca29b1270f","Type":"ContainerStarted","Data":"280aac38b9e3d9e418df498b4aa8fbaa57ec6ddacaca40c2265ebfb7941b0bdf"} Sep 30 14:23:57 crc kubenswrapper[4676]: I0930 14:23:57.495397 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"56d86f95-7d72-42d4-84d7-fdca29b1270f","Type":"ContainerStarted","Data":"a7e4ad4e6c9c4ad7faa0b58be77af0d3b7fabb5db225eee4d497b661a21da773"} Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.119250 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jnt8d"] Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.123608 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.125864 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.153629 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jnt8d"] Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.242097 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.242184 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-config\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.242358 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") 
" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.242597 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.242644 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.242682 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm5vv\" (UniqueName: \"kubernetes.io/projected/f87bbb67-f381-4585-be53-0ddea95f1495-kube-api-access-zm5vv\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.242706 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.344011 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: 
\"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.344058 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.344081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm5vv\" (UniqueName: \"kubernetes.io/projected/f87bbb67-f381-4585-be53-0ddea95f1495-kube-api-access-zm5vv\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.344098 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.344127 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.344156 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-config\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " 
pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.344196 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.345543 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-svc\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.346052 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.346834 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.347405 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc 
kubenswrapper[4676]: I0930 14:23:58.347959 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.349123 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-config\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.366279 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm5vv\" (UniqueName: \"kubernetes.io/projected/f87bbb67-f381-4585-be53-0ddea95f1495-kube-api-access-zm5vv\") pod \"dnsmasq-dns-67b789f86c-jnt8d\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.460173 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:23:58 crc kubenswrapper[4676]: I0930 14:23:58.936782 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jnt8d"] Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.523732 4676 generic.go:334] "Generic (PLEG): container finished" podID="f87bbb67-f381-4585-be53-0ddea95f1495" containerID="da55696fa9c4f256f295d73488e460c9e0e5b8cb2b59e6126705f55467203b89" exitCode=0 Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.523779 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" event={"ID":"f87bbb67-f381-4585-be53-0ddea95f1495","Type":"ContainerDied","Data":"da55696fa9c4f256f295d73488e460c9e0e5b8cb2b59e6126705f55467203b89"} Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.524051 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" event={"ID":"f87bbb67-f381-4585-be53-0ddea95f1495","Type":"ContainerStarted","Data":"94e25d05e8e67d98aa41a53a8f66826150c8f6aee2ee09152ad39717d7c229fd"} Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.919323 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.919609 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.919652 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.920388 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"629ff1f63cc08d0a90639ac0dfdfcd800429b2ca079d983358e85abb811d00e0"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:23:59 crc kubenswrapper[4676]: I0930 14:23:59.920457 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://629ff1f63cc08d0a90639ac0dfdfcd800429b2ca079d983358e85abb811d00e0" gracePeriod=600 Sep 30 14:24:00 crc kubenswrapper[4676]: I0930 14:24:00.536851 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="629ff1f63cc08d0a90639ac0dfdfcd800429b2ca079d983358e85abb811d00e0" exitCode=0 Sep 30 14:24:00 crc kubenswrapper[4676]: I0930 14:24:00.537018 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"629ff1f63cc08d0a90639ac0dfdfcd800429b2ca079d983358e85abb811d00e0"} Sep 30 14:24:00 crc kubenswrapper[4676]: I0930 14:24:00.537278 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"} Sep 30 14:24:00 crc kubenswrapper[4676]: I0930 14:24:00.537300 4676 scope.go:117] "RemoveContainer" 
containerID="5e91fb257d3a45cd5a74b5617de04aa40d1ce872ef596abb2a4557538639b58d" Sep 30 14:24:00 crc kubenswrapper[4676]: I0930 14:24:00.541583 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" event={"ID":"f87bbb67-f381-4585-be53-0ddea95f1495","Type":"ContainerStarted","Data":"aaf813e9cfb6cabf673e2eb3c8f7aaa9f41ca89bdb522967fe99cf9153836c39"} Sep 30 14:24:00 crc kubenswrapper[4676]: I0930 14:24:00.541813 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:24:00 crc kubenswrapper[4676]: I0930 14:24:00.575936 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" podStartSLOduration=2.575873858 podStartE2EDuration="2.575873858s" podCreationTimestamp="2025-09-30 14:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:24:00.574847882 +0000 UTC m=+1544.557936311" watchObservedRunningTime="2025-09-30 14:24:00.575873858 +0000 UTC m=+1544.558962287" Sep 30 14:24:04 crc kubenswrapper[4676]: E0930 14:24:04.363954 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c17bb9_4ccb_470c_9f5b_69dd8420b4b0.slice/crio-conmon-132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.462099 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.537784 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-dq8jp"] Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.538084 4676 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerName="dnsmasq-dns" containerID="cri-o://03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1" gracePeriod=10 Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.695366 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-qglmb"] Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.697671 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.705568 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-qglmb"] Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.861977 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6nb9\" (UniqueName: \"kubernetes.io/projected/a391dbc4-4a80-4a26-9e6d-5903b425ae97-kube-api-access-f6nb9\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.862087 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.862150 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " 
pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.862178 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.862277 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.862329 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.862470 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-config\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.964815 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") 
" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.964946 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.964970 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.965048 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.965091 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.965301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-config\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc 
kubenswrapper[4676]: I0930 14:24:08.965351 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6nb9\" (UniqueName: \"kubernetes.io/projected/a391dbc4-4a80-4a26-9e6d-5903b425ae97-kube-api-access-f6nb9\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.966077 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.966420 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.968871 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.969564 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.972361 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.972374 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a391dbc4-4a80-4a26-9e6d-5903b425ae97-config\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:08 crc kubenswrapper[4676]: I0930 14:24:08.986079 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6nb9\" (UniqueName: \"kubernetes.io/projected/a391dbc4-4a80-4a26-9e6d-5903b425ae97-kube-api-access-f6nb9\") pod \"dnsmasq-dns-cb6ffcf87-qglmb\" (UID: \"a391dbc4-4a80-4a26-9e6d-5903b425ae97\") " pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.044985 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.074341 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.169510 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-config\") pod \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.169621 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-swift-storage-0\") pod \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.169675 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-sb\") pod \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.169718 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7s7x\" (UniqueName: \"kubernetes.io/projected/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-kube-api-access-l7s7x\") pod \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.169762 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-nb\") pod \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.169787 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-svc\") pod \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\" (UID: \"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c\") " Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.176331 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-kube-api-access-l7s7x" (OuterVolumeSpecName: "kube-api-access-l7s7x") pod "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" (UID: "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c"). InnerVolumeSpecName "kube-api-access-l7s7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.241986 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" (UID: "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.254101 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" (UID: "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.258828 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" (UID: "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.263052 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" (UID: "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.264062 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-config" (OuterVolumeSpecName: "config") pod "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" (UID: "6090ed01-b4ad-4d41-8e55-a250b4fb9d1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.272596 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.272642 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.272659 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.272671 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7s7x\" (UniqueName: \"kubernetes.io/projected/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-kube-api-access-l7s7x\") on node \"crc\" DevicePath \"\"" Sep 
30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.272681 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.272690 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.591349 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-qglmb"] Sep 30 14:24:09 crc kubenswrapper[4676]: W0930 14:24:09.595922 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda391dbc4_4a80_4a26_9e6d_5903b425ae97.slice/crio-8b62175e482da7c7a19b7dca2927bf4ba8d192e263b9122417386d3caf6483db WatchSource:0}: Error finding container 8b62175e482da7c7a19b7dca2927bf4ba8d192e263b9122417386d3caf6483db: Status 404 returned error can't find the container with id 8b62175e482da7c7a19b7dca2927bf4ba8d192e263b9122417386d3caf6483db Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.629497 4676 generic.go:334] "Generic (PLEG): container finished" podID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerID="03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1" exitCode=0 Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.629550 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.629568 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" event={"ID":"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c","Type":"ContainerDied","Data":"03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1"} Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.630704 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" event={"ID":"6090ed01-b4ad-4d41-8e55-a250b4fb9d1c","Type":"ContainerDied","Data":"e46645cd417d3e064b1bee34a8bbb604763c29d757046d9413f923d3d6bebbb3"} Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.630759 4676 scope.go:117] "RemoveContainer" containerID="03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.635776 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" event={"ID":"a391dbc4-4a80-4a26-9e6d-5903b425ae97","Type":"ContainerStarted","Data":"8b62175e482da7c7a19b7dca2927bf4ba8d192e263b9122417386d3caf6483db"} Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.671686 4676 scope.go:117] "RemoveContainer" containerID="7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.673456 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-dq8jp"] Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.684586 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-dq8jp"] Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.774326 4676 scope.go:117] "RemoveContainer" containerID="03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1" Sep 30 14:24:09 crc kubenswrapper[4676]: E0930 14:24:09.775077 4676 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1\": container with ID starting with 03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1 not found: ID does not exist" containerID="03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.775140 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1"} err="failed to get container status \"03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1\": rpc error: code = NotFound desc = could not find container \"03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1\": container with ID starting with 03eaca873fb16fb2177e7dbbbc00649c914e4cfe297edfe2389b124064a3c3a1 not found: ID does not exist" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.775169 4676 scope.go:117] "RemoveContainer" containerID="7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46" Sep 30 14:24:09 crc kubenswrapper[4676]: E0930 14:24:09.776485 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46\": container with ID starting with 7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46 not found: ID does not exist" containerID="7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46" Sep 30 14:24:09 crc kubenswrapper[4676]: I0930 14:24:09.776514 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46"} err="failed to get container status \"7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46\": rpc error: code = NotFound desc = could not find container 
\"7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46\": container with ID starting with 7774731a0beec422fd24a38ec6d0ca2e543cf4ef9448e3831525e186d53bcc46 not found: ID does not exist" Sep 30 14:24:10 crc kubenswrapper[4676]: I0930 14:24:10.648586 4676 generic.go:334] "Generic (PLEG): container finished" podID="a391dbc4-4a80-4a26-9e6d-5903b425ae97" containerID="6d1583b5752d3dc72bdb120ae3e66c2af501727821e14df0ba08abf8341ead34" exitCode=0 Sep 30 14:24:10 crc kubenswrapper[4676]: I0930 14:24:10.648691 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" event={"ID":"a391dbc4-4a80-4a26-9e6d-5903b425ae97","Type":"ContainerDied","Data":"6d1583b5752d3dc72bdb120ae3e66c2af501727821e14df0ba08abf8341ead34"} Sep 30 14:24:11 crc kubenswrapper[4676]: I0930 14:24:11.446013 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" path="/var/lib/kubelet/pods/6090ed01-b4ad-4d41-8e55-a250b4fb9d1c/volumes" Sep 30 14:24:11 crc kubenswrapper[4676]: I0930 14:24:11.660329 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" event={"ID":"a391dbc4-4a80-4a26-9e6d-5903b425ae97","Type":"ContainerStarted","Data":"07953b68f419b3a37b65e358b75dee7bfe9513b01e99603541c20d5e1e243775"} Sep 30 14:24:11 crc kubenswrapper[4676]: I0930 14:24:11.660481 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:11 crc kubenswrapper[4676]: I0930 14:24:11.684160 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" podStartSLOduration=3.684113157 podStartE2EDuration="3.684113157s" podCreationTimestamp="2025-09-30 14:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:24:11.679530833 +0000 UTC m=+1555.662619272" 
watchObservedRunningTime="2025-09-30 14:24:11.684113157 +0000 UTC m=+1555.667201586" Sep 30 14:24:14 crc kubenswrapper[4676]: I0930 14:24:14.056338 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59cf4bdb65-dq8jp" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: i/o timeout" Sep 30 14:24:14 crc kubenswrapper[4676]: E0930 14:24:14.631667 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c17bb9_4ccb_470c_9f5b_69dd8420b4b0.slice/crio-conmon-132eb428f09d1cdd842bcfbd95531b10c13cd57968ace3fb1cbe737c1647f05d.scope\": RecentStats: unable to find data in memory cache]" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.046796 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-qglmb" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.115017 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jnt8d"] Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.115572 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" podUID="f87bbb67-f381-4585-be53-0ddea95f1495" containerName="dnsmasq-dns" containerID="cri-o://aaf813e9cfb6cabf673e2eb3c8f7aaa9f41ca89bdb522967fe99cf9153836c39" gracePeriod=10 Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.651765 4676 scope.go:117] "RemoveContainer" containerID="bc1ab75b26feecc1cc385882130093498814d61294f51694dac523605102ac71" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.739039 4676 generic.go:334] "Generic (PLEG): container finished" podID="f87bbb67-f381-4585-be53-0ddea95f1495" containerID="aaf813e9cfb6cabf673e2eb3c8f7aaa9f41ca89bdb522967fe99cf9153836c39" exitCode=0 Sep 30 14:24:19 crc kubenswrapper[4676]: 
I0930 14:24:19.739083 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" event={"ID":"f87bbb67-f381-4585-be53-0ddea95f1495","Type":"ContainerDied","Data":"aaf813e9cfb6cabf673e2eb3c8f7aaa9f41ca89bdb522967fe99cf9153836c39"} Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.739110 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" event={"ID":"f87bbb67-f381-4585-be53-0ddea95f1495","Type":"ContainerDied","Data":"94e25d05e8e67d98aa41a53a8f66826150c8f6aee2ee09152ad39717d7c229fd"} Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.739122 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e25d05e8e67d98aa41a53a8f66826150c8f6aee2ee09152ad39717d7c229fd" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.754714 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.776093 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm5vv\" (UniqueName: \"kubernetes.io/projected/f87bbb67-f381-4585-be53-0ddea95f1495-kube-api-access-zm5vv\") pod \"f87bbb67-f381-4585-be53-0ddea95f1495\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.776143 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-swift-storage-0\") pod \"f87bbb67-f381-4585-be53-0ddea95f1495\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.776200 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-sb\") pod 
\"f87bbb67-f381-4585-be53-0ddea95f1495\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.776223 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-config\") pod \"f87bbb67-f381-4585-be53-0ddea95f1495\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.776259 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-openstack-edpm-ipam\") pod \"f87bbb67-f381-4585-be53-0ddea95f1495\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.776289 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-nb\") pod \"f87bbb67-f381-4585-be53-0ddea95f1495\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.776327 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-svc\") pod \"f87bbb67-f381-4585-be53-0ddea95f1495\" (UID: \"f87bbb67-f381-4585-be53-0ddea95f1495\") " Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.788634 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87bbb67-f381-4585-be53-0ddea95f1495-kube-api-access-zm5vv" (OuterVolumeSpecName: "kube-api-access-zm5vv") pod "f87bbb67-f381-4585-be53-0ddea95f1495" (UID: "f87bbb67-f381-4585-be53-0ddea95f1495"). InnerVolumeSpecName "kube-api-access-zm5vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.855000 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f87bbb67-f381-4585-be53-0ddea95f1495" (UID: "f87bbb67-f381-4585-be53-0ddea95f1495"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.855109 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f87bbb67-f381-4585-be53-0ddea95f1495" (UID: "f87bbb67-f381-4585-be53-0ddea95f1495"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.859044 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f87bbb67-f381-4585-be53-0ddea95f1495" (UID: "f87bbb67-f381-4585-be53-0ddea95f1495"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.862723 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f87bbb67-f381-4585-be53-0ddea95f1495" (UID: "f87bbb67-f381-4585-be53-0ddea95f1495"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.864572 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-config" (OuterVolumeSpecName: "config") pod "f87bbb67-f381-4585-be53-0ddea95f1495" (UID: "f87bbb67-f381-4585-be53-0ddea95f1495"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.867911 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f87bbb67-f381-4585-be53-0ddea95f1495" (UID: "f87bbb67-f381-4585-be53-0ddea95f1495"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.877842 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.877869 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.877893 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.877903 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm5vv\" (UniqueName: \"kubernetes.io/projected/f87bbb67-f381-4585-be53-0ddea95f1495-kube-api-access-zm5vv\") on node \"crc\" DevicePath \"\"" 
Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.877914 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.877922 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:19 crc kubenswrapper[4676]: I0930 14:24:19.877931 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87bbb67-f381-4585-be53-0ddea95f1495-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:24:20 crc kubenswrapper[4676]: I0930 14:24:20.750969 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-jnt8d" Sep 30 14:24:20 crc kubenswrapper[4676]: I0930 14:24:20.788180 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jnt8d"] Sep 30 14:24:20 crc kubenswrapper[4676]: I0930 14:24:20.796375 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-jnt8d"] Sep 30 14:24:21 crc kubenswrapper[4676]: I0930 14:24:21.445365 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87bbb67-f381-4585-be53-0ddea95f1495" path="/var/lib/kubelet/pods/f87bbb67-f381-4585-be53-0ddea95f1495/volumes" Sep 30 14:24:26 crc kubenswrapper[4676]: I0930 14:24:26.805585 4676 generic.go:334] "Generic (PLEG): container finished" podID="140577c7-99f4-4dc1-85dd-1bec990df549" containerID="a27c6e0ad828c012ec9442f693ba51b22f10ee952758c3ec547473cb6dae45ae" exitCode=0 Sep 30 14:24:26 crc kubenswrapper[4676]: I0930 14:24:26.805735 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"140577c7-99f4-4dc1-85dd-1bec990df549","Type":"ContainerDied","Data":"a27c6e0ad828c012ec9442f693ba51b22f10ee952758c3ec547473cb6dae45ae"} Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.824013 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"140577c7-99f4-4dc1-85dd-1bec990df549","Type":"ContainerStarted","Data":"80f7f6249f3583381afaccd0f5804eb6edbf22172ec0669f9e4d6406dc6c2417"} Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.824776 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.827065 4676 generic.go:334] "Generic (PLEG): container finished" podID="56d86f95-7d72-42d4-84d7-fdca29b1270f" containerID="280aac38b9e3d9e418df498b4aa8fbaa57ec6ddacaca40c2265ebfb7941b0bdf" exitCode=0 Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.827114 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56d86f95-7d72-42d4-84d7-fdca29b1270f","Type":"ContainerDied","Data":"280aac38b9e3d9e418df498b4aa8fbaa57ec6ddacaca40c2265ebfb7941b0bdf"} Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.855748 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=33.85572981 podStartE2EDuration="33.85572981s" podCreationTimestamp="2025-09-30 14:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:24:27.850655547 +0000 UTC m=+1571.833743986" watchObservedRunningTime="2025-09-30 14:24:27.85572981 +0000 UTC m=+1571.838818239" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.895487 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm"] Sep 30 14:24:27 crc kubenswrapper[4676]: E0930 14:24:27.895997 4676 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerName="dnsmasq-dns" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.896034 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerName="dnsmasq-dns" Sep 30 14:24:27 crc kubenswrapper[4676]: E0930 14:24:27.896058 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerName="init" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.896064 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerName="init" Sep 30 14:24:27 crc kubenswrapper[4676]: E0930 14:24:27.896077 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87bbb67-f381-4585-be53-0ddea95f1495" containerName="dnsmasq-dns" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.896083 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87bbb67-f381-4585-be53-0ddea95f1495" containerName="dnsmasq-dns" Sep 30 14:24:27 crc kubenswrapper[4676]: E0930 14:24:27.896089 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87bbb67-f381-4585-be53-0ddea95f1495" containerName="init" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.896095 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87bbb67-f381-4585-be53-0ddea95f1495" containerName="init" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.896294 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87bbb67-f381-4585-be53-0ddea95f1495" containerName="dnsmasq-dns" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.896317 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6090ed01-b4ad-4d41-8e55-a250b4fb9d1c" containerName="dnsmasq-dns" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.896978 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.900762 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.901072 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.901208 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.901512 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:24:27 crc kubenswrapper[4676]: I0930 14:24:27.922667 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm"] Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.026251 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.026316 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.026573 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.026738 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5qk\" (UniqueName: \"kubernetes.io/projected/8d063f50-40de-47eb-9849-8b29cee35392-kube-api-access-wn5qk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.128782 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5qk\" (UniqueName: \"kubernetes.io/projected/8d063f50-40de-47eb-9849-8b29cee35392-kube-api-access-wn5qk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.128963 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.129001 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.129060 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.133646 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.133837 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.135290 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.149862 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5qk\" (UniqueName: \"kubernetes.io/projected/8d063f50-40de-47eb-9849-8b29cee35392-kube-api-access-wn5qk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.339918 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.847278 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56d86f95-7d72-42d4-84d7-fdca29b1270f","Type":"ContainerStarted","Data":"7bd55cb004c4db54eed6a559edb31d0f93f3d025c39101b18e4020f3d930e775"} Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.848074 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.877412 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=33.877394848 podStartE2EDuration="33.877394848s" podCreationTimestamp="2025-09-30 14:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:24:28.875501369 +0000 UTC m=+1572.858589808" watchObservedRunningTime="2025-09-30 14:24:28.877394848 +0000 UTC m=+1572.860483277" Sep 30 14:24:28 crc kubenswrapper[4676]: I0930 14:24:28.952868 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm"] Sep 30 14:24:29 crc kubenswrapper[4676]: I0930 14:24:29.861812 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" event={"ID":"8d063f50-40de-47eb-9849-8b29cee35392","Type":"ContainerStarted","Data":"7534b02874673f25df196ab04ee9ec95c8e8509fa28e377eda383c22c1aea362"} Sep 30 14:24:41 crc kubenswrapper[4676]: E0930 14:24:41.588921 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Sep 30 14:24:41 crc kubenswrapper[4676]: E0930 14:24:41.589695 4676 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 30 14:24:41 crc kubenswrapper[4676]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Sep 30 14:24:41 crc kubenswrapper[4676]: - hosts: all Sep 30 14:24:41 crc kubenswrapper[4676]: strategy: linear Sep 30 14:24:41 crc kubenswrapper[4676]: tasks: Sep 30 14:24:41 crc kubenswrapper[4676]: - name: Enable podified-repos Sep 30 14:24:41 crc kubenswrapper[4676]: become: true Sep 30 14:24:41 crc kubenswrapper[4676]: ansible.builtin.shell: | Sep 30 14:24:41 crc kubenswrapper[4676]: set -euxo pipefail Sep 30 14:24:41 crc kubenswrapper[4676]: pushd /var/tmp Sep 30 14:24:41 crc kubenswrapper[4676]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Sep 30 14:24:41 crc kubenswrapper[4676]: pushd repo-setup-main Sep 30 14:24:41 crc kubenswrapper[4676]: python3 -m venv ./venv Sep 30 14:24:41 crc kubenswrapper[4676]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Sep 30 14:24:41 crc kubenswrapper[4676]: ./venv/bin/repo-setup current-podified -b antelope Sep 30 14:24:41 crc kubenswrapper[4676]: 
popd Sep 30 14:24:41 crc kubenswrapper[4676]: rm -rf repo-setup-main Sep 30 14:24:41 crc kubenswrapper[4676]: Sep 30 14:24:41 crc kubenswrapper[4676]: Sep 30 14:24:41 crc kubenswrapper[4676]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Sep 30 14:24:41 crc kubenswrapper[4676]: edpm_override_hosts: openstack-edpm-ipam Sep 30 14:24:41 crc kubenswrapper[4676]: edpm_service_type: repo-setup Sep 30 14:24:41 crc kubenswrapper[4676]: Sep 30 14:24:41 crc kubenswrapper[4676]: Sep 30 14:24:41 crc kubenswrapper[4676]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/runner/env/ssh_key,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wn5qk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File
,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm_openstack(8d063f50-40de-47eb-9849-8b29cee35392): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Sep 30 14:24:41 crc kubenswrapper[4676]: > logger="UnhandledError" Sep 30 14:24:41 crc kubenswrapper[4676]: E0930 14:24:41.590867 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" podUID="8d063f50-40de-47eb-9849-8b29cee35392" Sep 30 14:24:42 crc kubenswrapper[4676]: E0930 14:24:42.004175 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" podUID="8d063f50-40de-47eb-9849-8b29cee35392" Sep 30 14:24:44 crc kubenswrapper[4676]: I0930 14:24:44.932540 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 14:24:45 crc kubenswrapper[4676]: I0930 14:24:45.978200 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 14:24:55 crc kubenswrapper[4676]: I0930 14:24:55.148463 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" event={"ID":"8d063f50-40de-47eb-9849-8b29cee35392","Type":"ContainerStarted","Data":"52a9d0a7eb9f7dab1ed8c8c52ec820ba734537aa21a4d37ed27e16df1dd16763"} Sep 30 14:24:55 crc kubenswrapper[4676]: I0930 14:24:55.166529 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" podStartSLOduration=3.197774784 podStartE2EDuration="28.166507495s" podCreationTimestamp="2025-09-30 14:24:27 +0000 UTC" firstStartedPulling="2025-09-30 14:24:28.958923777 +0000 UTC m=+1572.942012206" lastFinishedPulling="2025-09-30 14:24:53.927656488 +0000 UTC m=+1597.910744917" observedRunningTime="2025-09-30 14:24:55.165406155 +0000 UTC m=+1599.148494584" watchObservedRunningTime="2025-09-30 14:24:55.166507495 +0000 UTC m=+1599.149595924" Sep 30 14:25:09 crc kubenswrapper[4676]: I0930 14:25:09.285216 4676 generic.go:334] "Generic (PLEG): container finished" podID="8d063f50-40de-47eb-9849-8b29cee35392" containerID="52a9d0a7eb9f7dab1ed8c8c52ec820ba734537aa21a4d37ed27e16df1dd16763" exitCode=0 Sep 30 14:25:09 crc kubenswrapper[4676]: I0930 14:25:09.285307 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" event={"ID":"8d063f50-40de-47eb-9849-8b29cee35392","Type":"ContainerDied","Data":"52a9d0a7eb9f7dab1ed8c8c52ec820ba734537aa21a4d37ed27e16df1dd16763"} Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.783551 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.859247 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn5qk\" (UniqueName: \"kubernetes.io/projected/8d063f50-40de-47eb-9849-8b29cee35392-kube-api-access-wn5qk\") pod \"8d063f50-40de-47eb-9849-8b29cee35392\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.859398 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-inventory\") pod \"8d063f50-40de-47eb-9849-8b29cee35392\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.859439 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-ssh-key\") pod \"8d063f50-40de-47eb-9849-8b29cee35392\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.859514 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-repo-setup-combined-ca-bundle\") pod \"8d063f50-40de-47eb-9849-8b29cee35392\" (UID: \"8d063f50-40de-47eb-9849-8b29cee35392\") " Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.866965 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8d063f50-40de-47eb-9849-8b29cee35392" (UID: "8d063f50-40de-47eb-9849-8b29cee35392"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.868159 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d063f50-40de-47eb-9849-8b29cee35392-kube-api-access-wn5qk" (OuterVolumeSpecName: "kube-api-access-wn5qk") pod "8d063f50-40de-47eb-9849-8b29cee35392" (UID: "8d063f50-40de-47eb-9849-8b29cee35392"). InnerVolumeSpecName "kube-api-access-wn5qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.891418 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-inventory" (OuterVolumeSpecName: "inventory") pod "8d063f50-40de-47eb-9849-8b29cee35392" (UID: "8d063f50-40de-47eb-9849-8b29cee35392"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.893530 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d063f50-40de-47eb-9849-8b29cee35392" (UID: "8d063f50-40de-47eb-9849-8b29cee35392"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.961423 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.961461 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.961471 4676 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d063f50-40de-47eb-9849-8b29cee35392-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:10 crc kubenswrapper[4676]: I0930 14:25:10.961483 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn5qk\" (UniqueName: \"kubernetes.io/projected/8d063f50-40de-47eb-9849-8b29cee35392-kube-api-access-wn5qk\") on node \"crc\" DevicePath \"\"" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.308949 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" event={"ID":"8d063f50-40de-47eb-9849-8b29cee35392","Type":"ContainerDied","Data":"7534b02874673f25df196ab04ee9ec95c8e8509fa28e377eda383c22c1aea362"} Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.309023 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7534b02874673f25df196ab04ee9ec95c8e8509fa28e377eda383c22c1aea362" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.309034 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.386187 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"] Sep 30 14:25:11 crc kubenswrapper[4676]: E0930 14:25:11.386746 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d063f50-40de-47eb-9849-8b29cee35392" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.386773 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d063f50-40de-47eb-9849-8b29cee35392" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.386989 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d063f50-40de-47eb-9849-8b29cee35392" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.387815 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.390824 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.391072 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.391265 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.392320 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.395716 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"] Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.471497 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.471551 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjhvf\" (UniqueName: \"kubernetes.io/projected/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-kube-api-access-jjhvf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.471643 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.573680 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.573737 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjhvf\" (UniqueName: \"kubernetes.io/projected/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-kube-api-access-jjhvf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.573843 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.579691 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.579946 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.595533 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjhvf\" (UniqueName: \"kubernetes.io/projected/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-kube-api-access-jjhvf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-cj72b\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:11 crc kubenswrapper[4676]: I0930 14:25:11.706448 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:12 crc kubenswrapper[4676]: I0930 14:25:12.240330 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"]
Sep 30 14:25:12 crc kubenswrapper[4676]: I0930 14:25:12.249821 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 14:25:12 crc kubenswrapper[4676]: I0930 14:25:12.320605 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" event={"ID":"bcad236c-a0f7-47a2-ae9e-e52839eaee9d","Type":"ContainerStarted","Data":"796b3fc47c4f19e9f75911a4ca09931cce80b9fba536157345fccff18015edcd"}
Sep 30 14:25:14 crc kubenswrapper[4676]: I0930 14:25:14.339454 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" event={"ID":"bcad236c-a0f7-47a2-ae9e-e52839eaee9d","Type":"ContainerStarted","Data":"24cf954e13e683bd0a9b230f8bcd1b6ce77dc59df64babb21a96e6ae58dfce08"}
Sep 30 14:25:14 crc kubenswrapper[4676]: I0930 14:25:14.362514 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" podStartSLOduration=1.655039011 podStartE2EDuration="3.362457702s" podCreationTimestamp="2025-09-30 14:25:11 +0000 UTC" firstStartedPulling="2025-09-30 14:25:12.249617603 +0000 UTC m=+1616.232706032" lastFinishedPulling="2025-09-30 14:25:13.957036294 +0000 UTC m=+1617.940124723" observedRunningTime="2025-09-30 14:25:14.362402361 +0000 UTC m=+1618.345490800" watchObservedRunningTime="2025-09-30 14:25:14.362457702 +0000 UTC m=+1618.345546141"
Sep 30 14:25:17 crc kubenswrapper[4676]: I0930 14:25:17.371531 4676 generic.go:334] "Generic (PLEG): container finished" podID="bcad236c-a0f7-47a2-ae9e-e52839eaee9d" containerID="24cf954e13e683bd0a9b230f8bcd1b6ce77dc59df64babb21a96e6ae58dfce08" exitCode=0
Sep 30 14:25:17 crc kubenswrapper[4676]: I0930 14:25:17.371631 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" event={"ID":"bcad236c-a0f7-47a2-ae9e-e52839eaee9d","Type":"ContainerDied","Data":"24cf954e13e683bd0a9b230f8bcd1b6ce77dc59df64babb21a96e6ae58dfce08"}
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.779343 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.820788 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-inventory\") pod \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") "
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.821313 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjhvf\" (UniqueName: \"kubernetes.io/projected/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-kube-api-access-jjhvf\") pod \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") "
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.821526 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-ssh-key\") pod \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\" (UID: \"bcad236c-a0f7-47a2-ae9e-e52839eaee9d\") "
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.827071 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-kube-api-access-jjhvf" (OuterVolumeSpecName: "kube-api-access-jjhvf") pod "bcad236c-a0f7-47a2-ae9e-e52839eaee9d" (UID: "bcad236c-a0f7-47a2-ae9e-e52839eaee9d"). InnerVolumeSpecName "kube-api-access-jjhvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.850518 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bcad236c-a0f7-47a2-ae9e-e52839eaee9d" (UID: "bcad236c-a0f7-47a2-ae9e-e52839eaee9d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.854427 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-inventory" (OuterVolumeSpecName: "inventory") pod "bcad236c-a0f7-47a2-ae9e-e52839eaee9d" (UID: "bcad236c-a0f7-47a2-ae9e-e52839eaee9d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.924394 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.924670 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 14:25:18 crc kubenswrapper[4676]: I0930 14:25:18.924763 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjhvf\" (UniqueName: \"kubernetes.io/projected/bcad236c-a0f7-47a2-ae9e-e52839eaee9d-kube-api-access-jjhvf\") on node \"crc\" DevicePath \"\""
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.398304 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b" event={"ID":"bcad236c-a0f7-47a2-ae9e-e52839eaee9d","Type":"ContainerDied","Data":"796b3fc47c4f19e9f75911a4ca09931cce80b9fba536157345fccff18015edcd"}
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.398777 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="796b3fc47c4f19e9f75911a4ca09931cce80b9fba536157345fccff18015edcd"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.398399 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-cj72b"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.477774 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"]
Sep 30 14:25:19 crc kubenswrapper[4676]: E0930 14:25:19.478565 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcad236c-a0f7-47a2-ae9e-e52839eaee9d" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.478594 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcad236c-a0f7-47a2-ae9e-e52839eaee9d" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.478804 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcad236c-a0f7-47a2-ae9e-e52839eaee9d" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.479737 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.483742 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.484464 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.484647 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.484821 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.493316 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"]
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.536951 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ttr\" (UniqueName: \"kubernetes.io/projected/9093887d-1e08-4208-9584-a78c329fd7b0-kube-api-access-w7ttr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.537045 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.537115 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.537202 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.639221 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.639339 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.639408 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.639573 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ttr\" (UniqueName: \"kubernetes.io/projected/9093887d-1e08-4208-9584-a78c329fd7b0-kube-api-access-w7ttr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.644177 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.644498 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.646349 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.657081 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ttr\" (UniqueName: \"kubernetes.io/projected/9093887d-1e08-4208-9584-a78c329fd7b0-kube-api-access-w7ttr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:19 crc kubenswrapper[4676]: I0930 14:25:19.808125 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"
Sep 30 14:25:20 crc kubenswrapper[4676]: I0930 14:25:20.379811 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn"]
Sep 30 14:25:20 crc kubenswrapper[4676]: I0930 14:25:20.409248 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn" event={"ID":"9093887d-1e08-4208-9584-a78c329fd7b0","Type":"ContainerStarted","Data":"991fc01018bddd90c7a4a6ab3549bf466d37deb0a0d1a39d615717a950e26f45"}
Sep 30 14:25:21 crc kubenswrapper[4676]: I0930 14:25:21.419650 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn" event={"ID":"9093887d-1e08-4208-9584-a78c329fd7b0","Type":"ContainerStarted","Data":"aab5cc95c0f44c043701f63a404a04469cbac4551185ff904ade4f195c9d7bcd"}
Sep 30 14:25:21 crc kubenswrapper[4676]: I0930 14:25:21.455099 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn" podStartSLOduration=1.928414968 podStartE2EDuration="2.455057048s" podCreationTimestamp="2025-09-30 14:25:19 +0000 UTC" firstStartedPulling="2025-09-30 14:25:20.388407566 +0000 UTC m=+1624.371495996" lastFinishedPulling="2025-09-30 14:25:20.915049647 +0000 UTC m=+1624.898138076" observedRunningTime="2025-09-30 14:25:21.44253883 +0000 UTC m=+1625.425627269" watchObservedRunningTime="2025-09-30 14:25:21.455057048 +0000 UTC m=+1625.438145487"
Sep 30 14:26:19 crc kubenswrapper[4676]: I0930 14:26:19.857619 4676 scope.go:117] "RemoveContainer" containerID="185eae2b3dcc4a3ac93914b66f37a3ecbfc99964c50d6eb283b10d3dbdff7545"
Sep 30 14:26:19 crc kubenswrapper[4676]: I0930 14:26:19.900411 4676 scope.go:117] "RemoveContainer" containerID="87d9756869ed2a1da3dd6f1fbcc5f711c00b50d58c463696069450cfbf198086"
Sep 30 14:26:19 crc kubenswrapper[4676]: I0930 14:26:19.947070 4676 scope.go:117] "RemoveContainer" containerID="15d4fa73b81c64d64a663b9b1b1d62f874e8abfa7d4eb14bfeea527f22babbf6"
Sep 30 14:26:29 crc kubenswrapper[4676]: I0930 14:26:29.919874 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:26:29 crc kubenswrapper[4676]: I0930 14:26:29.920471 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:26:59 crc kubenswrapper[4676]: I0930 14:26:59.919363 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:26:59 crc kubenswrapper[4676]: I0930 14:26:59.920036 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:27:20 crc kubenswrapper[4676]: I0930 14:27:20.072775 4676 scope.go:117] "RemoveContainer" containerID="3ae371b754f0fb586408bdbfb98125a3b3dd21e489e16eeea16990c6f8789542"
Sep 30 14:27:20 crc kubenswrapper[4676]: I0930 14:27:20.101046 4676 scope.go:117] "RemoveContainer" containerID="66dab40a6f6f0114922534982e6172da0eede50d0b26c825e28b48c515304ad4"
Sep 30 14:27:29 crc kubenswrapper[4676]: I0930 14:27:29.920256 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:27:29 crc kubenswrapper[4676]: I0930 14:27:29.921015 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:27:29 crc kubenswrapper[4676]: I0930 14:27:29.921061 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 14:27:29 crc kubenswrapper[4676]: I0930 14:27:29.921828 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 14:27:29 crc kubenswrapper[4676]: I0930 14:27:29.921947 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" gracePeriod=600
Sep 30 14:27:30 crc kubenswrapper[4676]: E0930 14:27:30.046164 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:27:30 crc kubenswrapper[4676]: I0930 14:27:30.709963 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" exitCode=0
Sep 30 14:27:30 crc kubenswrapper[4676]: I0930 14:27:30.710015 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"}
Sep 30 14:27:30 crc kubenswrapper[4676]: I0930 14:27:30.710322 4676 scope.go:117] "RemoveContainer" containerID="629ff1f63cc08d0a90639ac0dfdfcd800429b2ca079d983358e85abb811d00e0"
Sep 30 14:27:30 crc kubenswrapper[4676]: I0930 14:27:30.711209 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:27:30 crc kubenswrapper[4676]: E0930 14:27:30.711505 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:27:35 crc kubenswrapper[4676]: I0930 14:27:35.048035 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6j825"]
Sep 30 14:27:35 crc kubenswrapper[4676]: I0930 14:27:35.059928 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6j825"]
Sep 30 14:27:35 crc kubenswrapper[4676]: I0930 14:27:35.444565 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1b569a-2fee-4891-8eb7-279d8efe9600" path="/var/lib/kubelet/pods/1b1b569a-2fee-4891-8eb7-279d8efe9600/volumes"
Sep 30 14:27:42 crc kubenswrapper[4676]: I0930 14:27:42.047323 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-msxhv"]
Sep 30 14:27:42 crc kubenswrapper[4676]: I0930 14:27:42.061288 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-k4qfl"]
Sep 30 14:27:42 crc kubenswrapper[4676]: I0930 14:27:42.068286 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-l4wcb"]
Sep 30 14:27:42 crc kubenswrapper[4676]: I0930 14:27:42.075265 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-msxhv"]
Sep 30 14:27:42 crc kubenswrapper[4676]: I0930 14:27:42.083306 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-k4qfl"]
Sep 30 14:27:42 crc kubenswrapper[4676]: I0930 14:27:42.091202 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-l4wcb"]
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.036135 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jlwpr"]
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.052146 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hw99z"]
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.063480 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jlwpr"]
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.073842 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hw99z"]
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.449462 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac6c313-76fd-461b-9ca1-b93c7e1fb915" path="/var/lib/kubelet/pods/1ac6c313-76fd-461b-9ca1-b93c7e1fb915/volumes"
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.451224 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f1c642-a974-44d6-8d01-457730d2a186" path="/var/lib/kubelet/pods/53f1c642-a974-44d6-8d01-457730d2a186/volumes"
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.451981 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca1c7f3-f23d-4e35-9183-4cbf943e567d" path="/var/lib/kubelet/pods/9ca1c7f3-f23d-4e35-9183-4cbf943e567d/volumes"
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.452714 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9f4602f-67b5-451d-9944-b3125ac805b2" path="/var/lib/kubelet/pods/b9f4602f-67b5-451d-9944-b3125ac805b2/volumes"
Sep 30 14:27:43 crc kubenswrapper[4676]: I0930 14:27:43.454372 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd479fd7-ab79-40ee-a0f0-e4f2b192c80c" path="/var/lib/kubelet/pods/dd479fd7-ab79-40ee-a0f0-e4f2b192c80c/volumes"
Sep 30 14:27:46 crc kubenswrapper[4676]: I0930 14:27:46.434669 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:27:46 crc kubenswrapper[4676]: E0930 14:27:46.435396 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.033769 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-180c-account-create-zkbbd"]
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.045165 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5ebc-account-create-7k42h"]
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.058581 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6aea-account-create-fd92w"]
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.072823 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-180c-account-create-zkbbd"]
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.083613 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5ebc-account-create-7k42h"]
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.093860 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6aea-account-create-fd92w"]
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.448068 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076c2b0a-4ad4-44b9-bec1-def5ed805975" path="/var/lib/kubelet/pods/076c2b0a-4ad4-44b9-bec1-def5ed805975/volumes"
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.449034 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7" path="/var/lib/kubelet/pods/c06b4228-b99e-40ab-bbea-8bfe2bb1ddd7/volumes"
Sep 30 14:27:53 crc kubenswrapper[4676]: I0930 14:27:53.450073 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a97d90-e028-46e3-afdf-d872d446f162" path="/var/lib/kubelet/pods/d6a97d90-e028-46e3-afdf-d872d446f162/volumes"
Sep 30 14:27:54 crc kubenswrapper[4676]: I0930 14:27:54.038897 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1aaa-account-create-gxldf"]
Sep 30 14:27:54 crc kubenswrapper[4676]: I0930 14:27:54.055588 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-38b7-account-create-msmmq"]
Sep 30 14:27:54 crc kubenswrapper[4676]: I0930 14:27:54.068149 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-bc91-account-create-zznlq"]
Sep 30 14:27:54 crc kubenswrapper[4676]: I0930 14:27:54.080822 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1aaa-account-create-gxldf"]
Sep 30 14:27:54 crc kubenswrapper[4676]: I0930 14:27:54.091831 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-38b7-account-create-msmmq"]
Sep 30 14:27:54 crc kubenswrapper[4676]: I0930 14:27:54.101941 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-bc91-account-create-zznlq"]
Sep 30 14:27:55 crc kubenswrapper[4676]: I0930 14:27:55.445966 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ea603d-33b6-4dfe-ad26-ccf513a14ae5" path="/var/lib/kubelet/pods/62ea603d-33b6-4dfe-ad26-ccf513a14ae5/volumes"
Sep 30 14:27:55 crc kubenswrapper[4676]: I0930 14:27:55.446855 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77bd056-9c07-43d9-bd25-02177d6b53cc" path="/var/lib/kubelet/pods/c77bd056-9c07-43d9-bd25-02177d6b53cc/volumes"
Sep 30 14:27:55 crc kubenswrapper[4676]: I0930 14:27:55.447417 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f56c6587-9965-4011-848b-c822e4572d6d" path="/var/lib/kubelet/pods/f56c6587-9965-4011-848b-c822e4572d6d/volumes"
Sep 30 14:27:59 crc kubenswrapper[4676]: I0930 14:27:59.433651 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:27:59 crc kubenswrapper[4676]: E0930 14:27:59.434573 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:28:13 crc kubenswrapper[4676]: I0930 14:28:13.433770 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:28:13 crc kubenswrapper[4676]: E0930 14:28:13.434581 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:28:18 crc kubenswrapper[4676]: I0930 14:28:18.777266 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5v2f2"]
Sep 30 14:28:18 crc kubenswrapper[4676]: I0930 14:28:18.781148 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:18 crc kubenswrapper[4676]: I0930 14:28:18.788757 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5v2f2"]
Sep 30 14:28:18 crc kubenswrapper[4676]: I0930 14:28:18.969920 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-catalog-content\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:18 crc kubenswrapper[4676]: I0930 14:28:18.970744 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-utilities\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:18 crc kubenswrapper[4676]: I0930 14:28:18.971224 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9tq7\" (UniqueName: \"kubernetes.io/projected/0a09a852-667f-41a2-9787-c38a6ef5da18-kube-api-access-v9tq7\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.073032 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9tq7\" (UniqueName: \"kubernetes.io/projected/0a09a852-667f-41a2-9787-c38a6ef5da18-kube-api-access-v9tq7\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.073132 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-catalog-content\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.073162 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-utilities\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.073782 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-catalog-content\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.073845 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-utilities\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.107967 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9tq7\" (UniqueName: \"kubernetes.io/projected/0a09a852-667f-41a2-9787-c38a6ef5da18-kube-api-access-v9tq7\") pod \"redhat-operators-5v2f2\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.115955 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5v2f2"
Sep 30 14:28:19 crc kubenswrapper[4676]: I0930 14:28:19.639344 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5v2f2"]
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.172786 4676 scope.go:117] "RemoveContainer" containerID="d0ce6ba72a1c93abd37956fee26ea9e9d4e759d0f1e835d8ab6423288b3aed90"
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.199497 4676 scope.go:117] "RemoveContainer" containerID="e72daba28095d339cd316212f8ade7d1e406e315c44893a2a47b1b080542f5ce"
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.220892 4676 generic.go:334] "Generic (PLEG): container finished" podID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerID="220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e" exitCode=0
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.221066 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v2f2" event={"ID":"0a09a852-667f-41a2-9787-c38a6ef5da18","Type":"ContainerDied","Data":"220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e"}
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.221624 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v2f2" event={"ID":"0a09a852-667f-41a2-9787-c38a6ef5da18","Type":"ContainerStarted","Data":"cd981734b3809e5bfd8dea1c02b26381d8cfe7eab04a5f2978d59b4407c7a9d4"}
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.263811 4676 scope.go:117] "RemoveContainer" containerID="e7909666c56ef6be2ee71a7394d383d2490ffe1053beac7d57a47a11963c70c9"
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.315327 4676 scope.go:117] "RemoveContainer" containerID="3b37d0fbff8312be575df3c088d4b2f86e874759e4a10839aa7bcab348b2f8ba"
Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.376891 4676 scope.go:117] "RemoveContainer"
containerID="a316181a84e64a874e15dc419463cdcd155cd443703667e9f347429c1c83ae7c" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.417192 4676 scope.go:117] "RemoveContainer" containerID="27156369ce0ce4742a4028041e67fd0beb70a95832d943bbf3147e6ccdadd5b7" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.438621 4676 scope.go:117] "RemoveContainer" containerID="f84cc29caea02667f85586eca8ef07d5f55a0e948f5838aac06c86ad4009c774" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.492227 4676 scope.go:117] "RemoveContainer" containerID="23fe04c2f3809d1f1c2287ff1b727f9347f93f0336fd8272f23001dbf8600b22" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.521101 4676 scope.go:117] "RemoveContainer" containerID="566181ad0387e74b65009333ef7a68a4306d2ae6824e78d7717d1153368e4148" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.550035 4676 scope.go:117] "RemoveContainer" containerID="20f52802eb411ddd44c1482ca9bae3b03ee472d9745b38fd59b42ce41f2d7667" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.571679 4676 scope.go:117] "RemoveContainer" containerID="c44fbee39b5cbb506a60b7580dd96675036b64e9982f8f9878b3268fad375e1c" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.612405 4676 scope.go:117] "RemoveContainer" containerID="7f9a9363dd2efb8b0d225b0419d5003e0dd11e4c40e34876500fdfacd02f2ac5" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.647338 4676 scope.go:117] "RemoveContainer" containerID="92cb55d388f6edb10e4c1e36fa5a9909cf2af12ec1c63b66cc7624d13a3a8e13" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.669563 4676 scope.go:117] "RemoveContainer" containerID="e51fa9717e27d3e2f72286bb5b0738d6d9067bb2754f56d1b5795876c3d36644" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.692276 4676 scope.go:117] "RemoveContainer" containerID="5317f29887f8501b5bcf2aaa71651b5a733faeb8ce126f8fbaf16d5d10878774" Sep 30 14:28:20 crc kubenswrapper[4676]: I0930 14:28:20.724022 4676 scope.go:117] "RemoveContainer" 
containerID="04d4adbd34469149417bcd5d2ed90120c0c6da2aec29a1482c962af67dcaeaae" Sep 30 14:28:22 crc kubenswrapper[4676]: I0930 14:28:22.249605 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v2f2" event={"ID":"0a09a852-667f-41a2-9787-c38a6ef5da18","Type":"ContainerStarted","Data":"79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9"} Sep 30 14:28:27 crc kubenswrapper[4676]: I0930 14:28:27.299129 4676 generic.go:334] "Generic (PLEG): container finished" podID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerID="79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9" exitCode=0 Sep 30 14:28:27 crc kubenswrapper[4676]: I0930 14:28:27.299230 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v2f2" event={"ID":"0a09a852-667f-41a2-9787-c38a6ef5da18","Type":"ContainerDied","Data":"79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9"} Sep 30 14:28:27 crc kubenswrapper[4676]: I0930 14:28:27.441719 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:28:27 crc kubenswrapper[4676]: E0930 14:28:27.442053 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:28:28 crc kubenswrapper[4676]: I0930 14:28:28.310338 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v2f2" event={"ID":"0a09a852-667f-41a2-9787-c38a6ef5da18","Type":"ContainerStarted","Data":"6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e"} Sep 30 14:28:28 crc 
kubenswrapper[4676]: I0930 14:28:28.335991 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5v2f2" podStartSLOduration=2.6029507 podStartE2EDuration="10.335970324s" podCreationTimestamp="2025-09-30 14:28:18 +0000 UTC" firstStartedPulling="2025-09-30 14:28:20.22314604 +0000 UTC m=+1804.206234459" lastFinishedPulling="2025-09-30 14:28:27.956165654 +0000 UTC m=+1811.939254083" observedRunningTime="2025-09-30 14:28:28.33122916 +0000 UTC m=+1812.314317619" watchObservedRunningTime="2025-09-30 14:28:28.335970324 +0000 UTC m=+1812.319058753" Sep 30 14:28:29 crc kubenswrapper[4676]: I0930 14:28:29.116303 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5v2f2" Sep 30 14:28:29 crc kubenswrapper[4676]: I0930 14:28:29.116664 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5v2f2" Sep 30 14:28:30 crc kubenswrapper[4676]: I0930 14:28:30.161791 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5v2f2" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="registry-server" probeResult="failure" output=< Sep 30 14:28:30 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Sep 30 14:28:30 crc kubenswrapper[4676]: > Sep 30 14:28:39 crc kubenswrapper[4676]: I0930 14:28:39.166450 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5v2f2" Sep 30 14:28:39 crc kubenswrapper[4676]: I0930 14:28:39.216584 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5v2f2" Sep 30 14:28:39 crc kubenswrapper[4676]: I0930 14:28:39.858582 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5v2f2"] Sep 30 14:28:40 crc kubenswrapper[4676]: I0930 
14:28:40.421492 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5v2f2" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="registry-server" containerID="cri-o://6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e" gracePeriod=2 Sep 30 14:28:40 crc kubenswrapper[4676]: I0930 14:28:40.871109 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5v2f2" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.025301 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-utilities\") pod \"0a09a852-667f-41a2-9787-c38a6ef5da18\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.025618 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9tq7\" (UniqueName: \"kubernetes.io/projected/0a09a852-667f-41a2-9787-c38a6ef5da18-kube-api-access-v9tq7\") pod \"0a09a852-667f-41a2-9787-c38a6ef5da18\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.025751 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-catalog-content\") pod \"0a09a852-667f-41a2-9787-c38a6ef5da18\" (UID: \"0a09a852-667f-41a2-9787-c38a6ef5da18\") " Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.025995 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-utilities" (OuterVolumeSpecName: "utilities") pod "0a09a852-667f-41a2-9787-c38a6ef5da18" (UID: "0a09a852-667f-41a2-9787-c38a6ef5da18"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.027529 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.031310 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a09a852-667f-41a2-9787-c38a6ef5da18-kube-api-access-v9tq7" (OuterVolumeSpecName: "kube-api-access-v9tq7") pod "0a09a852-667f-41a2-9787-c38a6ef5da18" (UID: "0a09a852-667f-41a2-9787-c38a6ef5da18"). InnerVolumeSpecName "kube-api-access-v9tq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.112188 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a09a852-667f-41a2-9787-c38a6ef5da18" (UID: "0a09a852-667f-41a2-9787-c38a6ef5da18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.129311 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9tq7\" (UniqueName: \"kubernetes.io/projected/0a09a852-667f-41a2-9787-c38a6ef5da18-kube-api-access-v9tq7\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.129353 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a09a852-667f-41a2-9787-c38a6ef5da18-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.431440 4676 generic.go:334] "Generic (PLEG): container finished" podID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerID="6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e" exitCode=0 Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.431486 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v2f2" event={"ID":"0a09a852-667f-41a2-9787-c38a6ef5da18","Type":"ContainerDied","Data":"6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e"} Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.431504 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5v2f2" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.431535 4676 scope.go:117] "RemoveContainer" containerID="6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.431520 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5v2f2" event={"ID":"0a09a852-667f-41a2-9787-c38a6ef5da18","Type":"ContainerDied","Data":"cd981734b3809e5bfd8dea1c02b26381d8cfe7eab04a5f2978d59b4407c7a9d4"} Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.432707 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:28:41 crc kubenswrapper[4676]: E0930 14:28:41.433116 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.455011 4676 scope.go:117] "RemoveContainer" containerID="79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.481072 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5v2f2"] Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.490557 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5v2f2"] Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.497242 4676 scope.go:117] "RemoveContainer" containerID="220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 
14:28:41.527045 4676 scope.go:117] "RemoveContainer" containerID="6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e" Sep 30 14:28:41 crc kubenswrapper[4676]: E0930 14:28:41.527470 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e\": container with ID starting with 6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e not found: ID does not exist" containerID="6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.527501 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e"} err="failed to get container status \"6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e\": rpc error: code = NotFound desc = could not find container \"6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e\": container with ID starting with 6f64dbdd241def305a39a31ee5ffb057cbca52f96fe1eef95a6b83f4250d259e not found: ID does not exist" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.527521 4676 scope.go:117] "RemoveContainer" containerID="79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9" Sep 30 14:28:41 crc kubenswrapper[4676]: E0930 14:28:41.527945 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9\": container with ID starting with 79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9 not found: ID does not exist" containerID="79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.527992 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9"} err="failed to get container status \"79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9\": rpc error: code = NotFound desc = could not find container \"79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9\": container with ID starting with 79598059ef03d27ffabb0d47857795d034e59ee17d2ed956299c827aae8ba8c9 not found: ID does not exist" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.528005 4676 scope.go:117] "RemoveContainer" containerID="220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e" Sep 30 14:28:41 crc kubenswrapper[4676]: E0930 14:28:41.528468 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e\": container with ID starting with 220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e not found: ID does not exist" containerID="220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e" Sep 30 14:28:41 crc kubenswrapper[4676]: I0930 14:28:41.528527 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e"} err="failed to get container status \"220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e\": rpc error: code = NotFound desc = could not find container \"220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e\": container with ID starting with 220d0ac1a8166828cb8d29c5b1b1556a42903fdacf22dac587c5b42501e1416e not found: ID does not exist" Sep 30 14:28:41 crc kubenswrapper[4676]: E0930 14:28:41.539425 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a09a852_667f_41a2_9787_c38a6ef5da18.slice/crio-cd981734b3809e5bfd8dea1c02b26381d8cfe7eab04a5f2978d59b4407c7a9d4\": RecentStats: unable to find data in memory cache]" Sep 30 14:28:42 crc kubenswrapper[4676]: I0930 14:28:42.443826 4676 generic.go:334] "Generic (PLEG): container finished" podID="9093887d-1e08-4208-9584-a78c329fd7b0" containerID="aab5cc95c0f44c043701f63a404a04469cbac4551185ff904ade4f195c9d7bcd" exitCode=0 Sep 30 14:28:42 crc kubenswrapper[4676]: I0930 14:28:42.443974 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn" event={"ID":"9093887d-1e08-4208-9584-a78c329fd7b0","Type":"ContainerDied","Data":"aab5cc95c0f44c043701f63a404a04469cbac4551185ff904ade4f195c9d7bcd"} Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.457809 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" path="/var/lib/kubelet/pods/0a09a852-667f-41a2-9787-c38a6ef5da18/volumes" Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.878958 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn" Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.987764 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-inventory\") pod \"9093887d-1e08-4208-9584-a78c329fd7b0\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.987980 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7ttr\" (UniqueName: \"kubernetes.io/projected/9093887d-1e08-4208-9584-a78c329fd7b0-kube-api-access-w7ttr\") pod \"9093887d-1e08-4208-9584-a78c329fd7b0\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.988010 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-bootstrap-combined-ca-bundle\") pod \"9093887d-1e08-4208-9584-a78c329fd7b0\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.988242 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-ssh-key\") pod \"9093887d-1e08-4208-9584-a78c329fd7b0\" (UID: \"9093887d-1e08-4208-9584-a78c329fd7b0\") " Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.994808 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9093887d-1e08-4208-9584-a78c329fd7b0" (UID: "9093887d-1e08-4208-9584-a78c329fd7b0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:28:43 crc kubenswrapper[4676]: I0930 14:28:43.994970 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9093887d-1e08-4208-9584-a78c329fd7b0-kube-api-access-w7ttr" (OuterVolumeSpecName: "kube-api-access-w7ttr") pod "9093887d-1e08-4208-9584-a78c329fd7b0" (UID: "9093887d-1e08-4208-9584-a78c329fd7b0"). InnerVolumeSpecName "kube-api-access-w7ttr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.019227 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9093887d-1e08-4208-9584-a78c329fd7b0" (UID: "9093887d-1e08-4208-9584-a78c329fd7b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.022406 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-inventory" (OuterVolumeSpecName: "inventory") pod "9093887d-1e08-4208-9584-a78c329fd7b0" (UID: "9093887d-1e08-4208-9584-a78c329fd7b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.091208 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.091266 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.091280 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7ttr\" (UniqueName: \"kubernetes.io/projected/9093887d-1e08-4208-9584-a78c329fd7b0-kube-api-access-w7ttr\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.091297 4676 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9093887d-1e08-4208-9584-a78c329fd7b0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.468438 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn" event={"ID":"9093887d-1e08-4208-9584-a78c329fd7b0","Type":"ContainerDied","Data":"991fc01018bddd90c7a4a6ab3549bf466d37deb0a0d1a39d615717a950e26f45"} Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.468485 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="991fc01018bddd90c7a4a6ab3549bf466d37deb0a0d1a39d615717a950e26f45" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.468492 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.557455 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"] Sep 30 14:28:44 crc kubenswrapper[4676]: E0930 14:28:44.557925 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="registry-server" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.557943 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="registry-server" Sep 30 14:28:44 crc kubenswrapper[4676]: E0930 14:28:44.557973 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9093887d-1e08-4208-9584-a78c329fd7b0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.557983 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9093887d-1e08-4208-9584-a78c329fd7b0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Sep 30 14:28:44 crc kubenswrapper[4676]: E0930 14:28:44.557994 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="extract-content" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.558000 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="extract-content" Sep 30 14:28:44 crc kubenswrapper[4676]: E0930 14:28:44.558015 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="extract-utilities" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.558021 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="extract-utilities" Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.558206 
4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9093887d-1e08-4208-9584-a78c329fd7b0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.558228 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a09a852-667f-41a2-9787-c38a6ef5da18" containerName="registry-server"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.558856 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.562215 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.562224 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.563739 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.563743 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.569674 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"]
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.702501 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfcw\" (UniqueName: \"kubernetes.io/projected/8b08c117-d7d7-4bc3-89a0-8a05169688fa-kube-api-access-zhfcw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.702571 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.702612 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.804423 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.804487 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.804629 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfcw\" (UniqueName: \"kubernetes.io/projected/8b08c117-d7d7-4bc3-89a0-8a05169688fa-kube-api-access-zhfcw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.809572 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.825185 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.826774 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfcw\" (UniqueName: \"kubernetes.io/projected/8b08c117-d7d7-4bc3-89a0-8a05169688fa-kube-api-access-zhfcw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:44 crc kubenswrapper[4676]: I0930 14:28:44.876068 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:28:45 crc kubenswrapper[4676]: I0930 14:28:45.403742 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"]
Sep 30 14:28:45 crc kubenswrapper[4676]: W0930 14:28:45.409248 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b08c117_d7d7_4bc3_89a0_8a05169688fa.slice/crio-a948a9a08678cd6c62ef614dd15c41b8e192e45b34ef35938064fa5bc276179e WatchSource:0}: Error finding container a948a9a08678cd6c62ef614dd15c41b8e192e45b34ef35938064fa5bc276179e: Status 404 returned error can't find the container with id a948a9a08678cd6c62ef614dd15c41b8e192e45b34ef35938064fa5bc276179e
Sep 30 14:28:45 crc kubenswrapper[4676]: I0930 14:28:45.478211 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c" event={"ID":"8b08c117-d7d7-4bc3-89a0-8a05169688fa","Type":"ContainerStarted","Data":"a948a9a08678cd6c62ef614dd15c41b8e192e45b34ef35938064fa5bc276179e"}
Sep 30 14:28:46 crc kubenswrapper[4676]: I0930 14:28:46.490641 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c" event={"ID":"8b08c117-d7d7-4bc3-89a0-8a05169688fa","Type":"ContainerStarted","Data":"b77518ddbf654667cdb9dd5f8d8df695f26804f36fa36399c14cc3255e526df8"}
Sep 30 14:28:46 crc kubenswrapper[4676]: I0930 14:28:46.514949 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c" podStartSLOduration=1.945658244 podStartE2EDuration="2.514920148s" podCreationTimestamp="2025-09-30 14:28:44 +0000 UTC" firstStartedPulling="2025-09-30 14:28:45.412752824 +0000 UTC m=+1829.395841253" lastFinishedPulling="2025-09-30 14:28:45.982014728 +0000 UTC m=+1829.965103157" observedRunningTime="2025-09-30 14:28:46.508714341 +0000 UTC m=+1830.491802790" watchObservedRunningTime="2025-09-30 14:28:46.514920148 +0000 UTC m=+1830.498008577"
Sep 30 14:28:54 crc kubenswrapper[4676]: I0930 14:28:54.433646 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:28:54 crc kubenswrapper[4676]: E0930 14:28:54.434490 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:29:08 crc kubenswrapper[4676]: I0930 14:29:08.433037 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:29:08 crc kubenswrapper[4676]: E0930 14:29:08.433922 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:29:20 crc kubenswrapper[4676]: I0930 14:29:20.433460 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:29:20 crc kubenswrapper[4676]: E0930 14:29:20.434342 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:29:21 crc kubenswrapper[4676]: I0930 14:29:21.017103 4676 scope.go:117] "RemoveContainer" containerID="c2fe8942b64a932ed3b353cb4dcf782d249e316feec4811ecca955cb274b1862"
Sep 30 14:29:21 crc kubenswrapper[4676]: I0930 14:29:21.059195 4676 scope.go:117] "RemoveContainer" containerID="f5ca0365b3c5c12d4896860c008e693871d812adb6bb057a94a362affcc71020"
Sep 30 14:29:21 crc kubenswrapper[4676]: I0930 14:29:21.084989 4676 scope.go:117] "RemoveContainer" containerID="b1b4523951ab1e8bc1137c241434e6e7763322c9f155a07229684db62157a000"
Sep 30 14:29:22 crc kubenswrapper[4676]: I0930 14:29:22.058653 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2m994"]
Sep 30 14:29:22 crc kubenswrapper[4676]: I0930 14:29:22.067970 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2m994"]
Sep 30 14:29:23 crc kubenswrapper[4676]: I0930 14:29:23.446903 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f731ab-125f-48d4-838e-a38a5e78c6fb" path="/var/lib/kubelet/pods/b0f731ab-125f-48d4-838e-a38a5e78c6fb/volumes"
Sep 30 14:29:31 crc kubenswrapper[4676]: I0930 14:29:31.433546 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:29:31 crc kubenswrapper[4676]: E0930 14:29:31.435321 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:29:42 crc kubenswrapper[4676]: I0930 14:29:42.433844 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:29:42 crc kubenswrapper[4676]: E0930 14:29:42.434649 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:29:54 crc kubenswrapper[4676]: I0930 14:29:54.433338 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:29:54 crc kubenswrapper[4676]: E0930 14:29:54.435309 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.147946 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"]
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.149964 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.152088 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.152555 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.174204 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"]
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.215302 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/358635b3-41df-4e35-8de1-c747671c043c-config-volume\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.215390 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/358635b3-41df-4e35-8de1-c747671c043c-secret-volume\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.215542 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bkgp\" (UniqueName: \"kubernetes.io/projected/358635b3-41df-4e35-8de1-c747671c043c-kube-api-access-8bkgp\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.317651 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/358635b3-41df-4e35-8de1-c747671c043c-config-volume\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.317731 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/358635b3-41df-4e35-8de1-c747671c043c-secret-volume\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.317936 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bkgp\" (UniqueName: \"kubernetes.io/projected/358635b3-41df-4e35-8de1-c747671c043c-kube-api-access-8bkgp\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.318994 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/358635b3-41df-4e35-8de1-c747671c043c-config-volume\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.356401 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/358635b3-41df-4e35-8de1-c747671c043c-secret-volume\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.364812 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bkgp\" (UniqueName: \"kubernetes.io/projected/358635b3-41df-4e35-8de1-c747671c043c-kube-api-access-8bkgp\") pod \"collect-profiles-29320710-84s5f\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.473535 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:00 crc kubenswrapper[4676]: I0930 14:30:00.945286 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"]
Sep 30 14:30:01 crc kubenswrapper[4676]: I0930 14:30:01.198608 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f" event={"ID":"358635b3-41df-4e35-8de1-c747671c043c","Type":"ContainerStarted","Data":"0f545cd3b73093934e89366588e462430c3fdb303aaad69d61404f891e4b7c0c"}
Sep 30 14:30:01 crc kubenswrapper[4676]: I0930 14:30:01.199037 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f" event={"ID":"358635b3-41df-4e35-8de1-c747671c043c","Type":"ContainerStarted","Data":"a5a6207673325e376670ccf5bc27fe653bdad69323e7a08e0eb6ec01899c865f"}
Sep 30 14:30:01 crc kubenswrapper[4676]: I0930 14:30:01.234870 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f" podStartSLOduration=1.234850096 podStartE2EDuration="1.234850096s" podCreationTimestamp="2025-09-30 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 14:30:01.232872206 +0000 UTC m=+1905.215960645" watchObservedRunningTime="2025-09-30 14:30:01.234850096 +0000 UTC m=+1905.217938525"
Sep 30 14:30:02 crc kubenswrapper[4676]: I0930 14:30:02.208094 4676 generic.go:334] "Generic (PLEG): container finished" podID="358635b3-41df-4e35-8de1-c747671c043c" containerID="0f545cd3b73093934e89366588e462430c3fdb303aaad69d61404f891e4b7c0c" exitCode=0
Sep 30 14:30:02 crc kubenswrapper[4676]: I0930 14:30:02.208143 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f" event={"ID":"358635b3-41df-4e35-8de1-c747671c043c","Type":"ContainerDied","Data":"0f545cd3b73093934e89366588e462430c3fdb303aaad69d61404f891e4b7c0c"}
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.557454 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.582570 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/358635b3-41df-4e35-8de1-c747671c043c-config-volume\") pod \"358635b3-41df-4e35-8de1-c747671c043c\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") "
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.582646 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/358635b3-41df-4e35-8de1-c747671c043c-secret-volume\") pod \"358635b3-41df-4e35-8de1-c747671c043c\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") "
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.582777 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bkgp\" (UniqueName: \"kubernetes.io/projected/358635b3-41df-4e35-8de1-c747671c043c-kube-api-access-8bkgp\") pod \"358635b3-41df-4e35-8de1-c747671c043c\" (UID: \"358635b3-41df-4e35-8de1-c747671c043c\") "
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.583534 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358635b3-41df-4e35-8de1-c747671c043c-config-volume" (OuterVolumeSpecName: "config-volume") pod "358635b3-41df-4e35-8de1-c747671c043c" (UID: "358635b3-41df-4e35-8de1-c747671c043c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.589137 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/358635b3-41df-4e35-8de1-c747671c043c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "358635b3-41df-4e35-8de1-c747671c043c" (UID: "358635b3-41df-4e35-8de1-c747671c043c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.598780 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358635b3-41df-4e35-8de1-c747671c043c-kube-api-access-8bkgp" (OuterVolumeSpecName: "kube-api-access-8bkgp") pod "358635b3-41df-4e35-8de1-c747671c043c" (UID: "358635b3-41df-4e35-8de1-c747671c043c"). InnerVolumeSpecName "kube-api-access-8bkgp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.684910 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/358635b3-41df-4e35-8de1-c747671c043c-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.684961 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/358635b3-41df-4e35-8de1-c747671c043c-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 14:30:03 crc kubenswrapper[4676]: I0930 14:30:03.684978 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bkgp\" (UniqueName: \"kubernetes.io/projected/358635b3-41df-4e35-8de1-c747671c043c-kube-api-access-8bkgp\") on node \"crc\" DevicePath \"\""
Sep 30 14:30:04 crc kubenswrapper[4676]: I0930 14:30:04.230564 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f" event={"ID":"358635b3-41df-4e35-8de1-c747671c043c","Type":"ContainerDied","Data":"a5a6207673325e376670ccf5bc27fe653bdad69323e7a08e0eb6ec01899c865f"}
Sep 30 14:30:04 crc kubenswrapper[4676]: I0930 14:30:04.230621 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a6207673325e376670ccf5bc27fe653bdad69323e7a08e0eb6ec01899c865f"
Sep 30 14:30:04 crc kubenswrapper[4676]: I0930 14:30:04.230671 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320710-84s5f"
Sep 30 14:30:10 crc kubenswrapper[4676]: I0930 14:30:10.436773 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590"
Sep 30 14:30:10 crc kubenswrapper[4676]: E0930 14:30:10.438090 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:30:16 crc kubenswrapper[4676]: I0930 14:30:16.045212 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-g276r"]
Sep 30 14:30:16 crc kubenswrapper[4676]: I0930 14:30:16.055690 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-g276r"]
Sep 30 14:30:17 crc kubenswrapper[4676]: I0930 14:30:17.446951 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da58f130-eba3-4e56-97c1-6eba2641fa7d" path="/var/lib/kubelet/pods/da58f130-eba3-4e56-97c1-6eba2641fa7d/volumes"
Sep 30 14:30:18 crc kubenswrapper[4676]: I0930 14:30:18.025571 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-45gkb"]
Sep 30 14:30:18 crc kubenswrapper[4676]: I0930 14:30:18.035953 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-45gkb"]
Sep 30 14:30:19 crc kubenswrapper[4676]: I0930 14:30:19.380927 4676 generic.go:334] "Generic (PLEG): container finished" podID="8b08c117-d7d7-4bc3-89a0-8a05169688fa" containerID="b77518ddbf654667cdb9dd5f8d8df695f26804f36fa36399c14cc3255e526df8" exitCode=0
Sep 30 14:30:19 crc kubenswrapper[4676]: I0930 14:30:19.381013 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c" event={"ID":"8b08c117-d7d7-4bc3-89a0-8a05169688fa","Type":"ContainerDied","Data":"b77518ddbf654667cdb9dd5f8d8df695f26804f36fa36399c14cc3255e526df8"}
Sep 30 14:30:19 crc kubenswrapper[4676]: I0930 14:30:19.444375 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04eaad6-de72-4265-aa3d-fda03a0ea925" path="/var/lib/kubelet/pods/b04eaad6-de72-4265-aa3d-fda03a0ea925/volumes"
Sep 30 14:30:20 crc kubenswrapper[4676]: I0930 14:30:20.844071 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:30:20 crc kubenswrapper[4676]: I0930 14:30:20.924587 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhfcw\" (UniqueName: \"kubernetes.io/projected/8b08c117-d7d7-4bc3-89a0-8a05169688fa-kube-api-access-zhfcw\") pod \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") "
Sep 30 14:30:20 crc kubenswrapper[4676]: I0930 14:30:20.924994 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-inventory\") pod \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") "
Sep 30 14:30:20 crc kubenswrapper[4676]: I0930 14:30:20.925103 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-ssh-key\") pod \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\" (UID: \"8b08c117-d7d7-4bc3-89a0-8a05169688fa\") "
Sep 30 14:30:20 crc kubenswrapper[4676]: I0930 14:30:20.932597 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b08c117-d7d7-4bc3-89a0-8a05169688fa-kube-api-access-zhfcw" (OuterVolumeSpecName: "kube-api-access-zhfcw") pod "8b08c117-d7d7-4bc3-89a0-8a05169688fa" (UID: "8b08c117-d7d7-4bc3-89a0-8a05169688fa"). InnerVolumeSpecName "kube-api-access-zhfcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:30:20 crc kubenswrapper[4676]: I0930 14:30:20.953663 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-inventory" (OuterVolumeSpecName: "inventory") pod "8b08c117-d7d7-4bc3-89a0-8a05169688fa" (UID: "8b08c117-d7d7-4bc3-89a0-8a05169688fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:30:20 crc kubenswrapper[4676]: I0930 14:30:20.954020 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b08c117-d7d7-4bc3-89a0-8a05169688fa" (UID: "8b08c117-d7d7-4bc3-89a0-8a05169688fa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.027455 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.027506 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhfcw\" (UniqueName: \"kubernetes.io/projected/8b08c117-d7d7-4bc3-89a0-8a05169688fa-kube-api-access-zhfcw\") on node \"crc\" DevicePath \"\""
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.027518 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b08c117-d7d7-4bc3-89a0-8a05169688fa-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.170815 4676 scope.go:117] "RemoveContainer" containerID="630dba75d1cd824e7fa8825717aee62c6199b7ac50d57cd0d1d5426a4357dd15"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.247122 4676 scope.go:117] "RemoveContainer" containerID="f88a084860b35a0926c41bf279dcba241b63c4b3ad1a0f192816a697b1df10bd"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.280117 4676 scope.go:117] "RemoveContainer" containerID="aaf813e9cfb6cabf673e2eb3c8f7aaa9f41ca89bdb522967fe99cf9153836c39"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.300100 4676 scope.go:117] "RemoveContainer" containerID="2d2735e743682cc3ec46f83546a934923f0909888643923321d0b0c8d3f62c52"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.333166 4676 scope.go:117] "RemoveContainer" containerID="da55696fa9c4f256f295d73488e460c9e0e5b8cb2b59e6126705f55467203b89"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.405379 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.405978 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c" event={"ID":"8b08c117-d7d7-4bc3-89a0-8a05169688fa","Type":"ContainerDied","Data":"a948a9a08678cd6c62ef614dd15c41b8e192e45b34ef35938064fa5bc276179e"}
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.406036 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a948a9a08678cd6c62ef614dd15c41b8e192e45b34ef35938064fa5bc276179e"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.501759 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"]
Sep 30 14:30:21 crc kubenswrapper[4676]: E0930 14:30:21.502272 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b08c117-d7d7-4bc3-89a0-8a05169688fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.502299 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b08c117-d7d7-4bc3-89a0-8a05169688fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:30:21 crc kubenswrapper[4676]: E0930 14:30:21.502326 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358635b3-41df-4e35-8de1-c747671c043c" containerName="collect-profiles"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.502335 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="358635b3-41df-4e35-8de1-c747671c043c" containerName="collect-profiles"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.502623 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="358635b3-41df-4e35-8de1-c747671c043c" containerName="collect-profiles"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.502658 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b08c117-d7d7-4bc3-89a0-8a05169688fa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.503336 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.505972 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.506421 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.506459 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.506533 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.524990 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"]
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.648874 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.649463 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.649666 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qbb\" (UniqueName: \"kubernetes.io/projected/24983a6b-dac1-4567-b8b8-ded54e7287bb-kube-api-access-s2qbb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.752639 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.752824 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.752932 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qbb\" (UniqueName: \"kubernetes.io/projected/24983a6b-dac1-4567-b8b8-ded54e7287bb-kube-api-access-s2qbb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.757700 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.761053 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.770414 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qbb\" (UniqueName: \"kubernetes.io/projected/24983a6b-dac1-4567-b8b8-ded54e7287bb-kube-api-access-s2qbb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-56d9g\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:21 crc kubenswrapper[4676]: I0930 14:30:21.823572 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"
Sep 30 14:30:22 crc kubenswrapper[4676]: I0930 14:30:22.371455 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g"]
Sep 30 14:30:22 crc kubenswrapper[4676]: W0930 14:30:22.378156 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24983a6b_dac1_4567_b8b8_ded54e7287bb.slice/crio-063f660872a97fc439944ded324989c5741b6f5ac2240341caf19ca2744b829a WatchSource:0}: Error finding container 063f660872a97fc439944ded324989c5741b6f5ac2240341caf19ca2744b829a: Status 404 returned error can't find the container with id 063f660872a97fc439944ded324989c5741b6f5ac2240341caf19ca2744b829a
Sep 30 14:30:22 crc kubenswrapper[4676]: I0930 14:30:22.380770 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 14:30:22 crc kubenswrapper[4676]: I0930 14:30:22.426515 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g" event={"ID":"24983a6b-dac1-4567-b8b8-ded54e7287bb","Type":"ContainerStarted","Data":"063f660872a97fc439944ded324989c5741b6f5ac2240341caf19ca2744b829a"}
Sep 30 14:30:23 crc kubenswrapper[4676]: I0930 14:30:23.445299 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g" event={"ID":"24983a6b-dac1-4567-b8b8-ded54e7287bb","Type":"ContainerStarted","Data":"729bf626cb28a393ae13ae5835c5ad6b364ce23c873234cccb8d24597ee15a4f"}
Sep 30 14:30:23 crc kubenswrapper[4676]: I0930 14:30:23.474627 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g" podStartSLOduration=2.037893673 podStartE2EDuration="2.474596852s"
podCreationTimestamp="2025-09-30 14:30:21 +0000 UTC" firstStartedPulling="2025-09-30 14:30:22.38051233 +0000 UTC m=+1926.363600759" lastFinishedPulling="2025-09-30 14:30:22.817215509 +0000 UTC m=+1926.800303938" observedRunningTime="2025-09-30 14:30:23.46774828 +0000 UTC m=+1927.450836709" watchObservedRunningTime="2025-09-30 14:30:23.474596852 +0000 UTC m=+1927.457685281" Sep 30 14:30:24 crc kubenswrapper[4676]: I0930 14:30:24.043418 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-fptlq"] Sep 30 14:30:24 crc kubenswrapper[4676]: I0930 14:30:24.050866 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-fptlq"] Sep 30 14:30:25 crc kubenswrapper[4676]: I0930 14:30:25.436165 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:30:25 crc kubenswrapper[4676]: E0930 14:30:25.436801 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:30:25 crc kubenswrapper[4676]: I0930 14:30:25.447561 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7724b184-865f-4ced-bdf7-867184cf3647" path="/var/lib/kubelet/pods/7724b184-865f-4ced-bdf7-867184cf3647/volumes" Sep 30 14:30:31 crc kubenswrapper[4676]: I0930 14:30:31.029260 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-b5nzl"] Sep 30 14:30:31 crc kubenswrapper[4676]: I0930 14:30:31.036725 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-b5nzl"] Sep 30 14:30:31 crc kubenswrapper[4676]: I0930 14:30:31.443520 4676 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1fd6e1e-38da-4634-9862-21c027ea770a" path="/var/lib/kubelet/pods/c1fd6e1e-38da-4634-9862-21c027ea770a/volumes" Sep 30 14:30:38 crc kubenswrapper[4676]: I0930 14:30:38.432563 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:30:38 crc kubenswrapper[4676]: E0930 14:30:38.433347 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:30:43 crc kubenswrapper[4676]: I0930 14:30:43.055311 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wxj4s"] Sep 30 14:30:43 crc kubenswrapper[4676]: I0930 14:30:43.068400 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wxj4s"] Sep 30 14:30:43 crc kubenswrapper[4676]: I0930 14:30:43.444166 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63080796-b0be-4b3a-8db5-8242e2eb2bb3" path="/var/lib/kubelet/pods/63080796-b0be-4b3a-8db5-8242e2eb2bb3/volumes" Sep 30 14:30:52 crc kubenswrapper[4676]: I0930 14:30:52.433562 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:30:52 crc kubenswrapper[4676]: E0930 14:30:52.434293 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:30:58 crc kubenswrapper[4676]: I0930 14:30:58.029761 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-2mftx"] Sep 30 14:30:58 crc kubenswrapper[4676]: I0930 14:30:58.036847 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-2mftx"] Sep 30 14:30:59 crc kubenswrapper[4676]: I0930 14:30:59.446766 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc" path="/var/lib/kubelet/pods/f9a3961e-e61e-4a02-9fe4-bc1b5ae097cc/volumes" Sep 30 14:31:03 crc kubenswrapper[4676]: I0930 14:31:03.433647 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:31:03 crc kubenswrapper[4676]: E0930 14:31:03.434919 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:31:14 crc kubenswrapper[4676]: I0930 14:31:14.433744 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:31:14 crc kubenswrapper[4676]: E0930 14:31:14.434640 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" 
podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:31:21 crc kubenswrapper[4676]: I0930 14:31:21.474679 4676 scope.go:117] "RemoveContainer" containerID="75ae60f130bd580b01b0188c854fd3510ab5c9f6f7640cee5f9d9c91dd5d9345" Sep 30 14:31:21 crc kubenswrapper[4676]: I0930 14:31:21.525218 4676 scope.go:117] "RemoveContainer" containerID="b93a672a897fe21e83c7690dbae28e10b7055fe7971a695f3c3c8ae60a15a1ba" Sep 30 14:31:21 crc kubenswrapper[4676]: I0930 14:31:21.561699 4676 scope.go:117] "RemoveContainer" containerID="086ac2a3788928169cf290661c3c50dd7ad4957cff71c7bde933f95169f8b4ef" Sep 30 14:31:21 crc kubenswrapper[4676]: I0930 14:31:21.623314 4676 scope.go:117] "RemoveContainer" containerID="a9ee4e61b97d16119699b853291cd49fbeb6cc8203cf25984ecfb1d6e96db9cf" Sep 30 14:31:26 crc kubenswrapper[4676]: I0930 14:31:26.051920 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tlcl8"] Sep 30 14:31:26 crc kubenswrapper[4676]: I0930 14:31:26.060736 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vb8zf"] Sep 30 14:31:26 crc kubenswrapper[4676]: I0930 14:31:26.069229 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4tch9"] Sep 30 14:31:26 crc kubenswrapper[4676]: I0930 14:31:26.076297 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tlcl8"] Sep 30 14:31:26 crc kubenswrapper[4676]: I0930 14:31:26.083734 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vb8zf"] Sep 30 14:31:26 crc kubenswrapper[4676]: I0930 14:31:26.091387 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4tch9"] Sep 30 14:31:26 crc kubenswrapper[4676]: I0930 14:31:26.433689 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:31:26 crc kubenswrapper[4676]: E0930 14:31:26.434018 4676 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:31:27 crc kubenswrapper[4676]: I0930 14:31:27.443750 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c62023c-427e-406f-af99-0d50b6808acb" path="/var/lib/kubelet/pods/2c62023c-427e-406f-af99-0d50b6808acb/volumes" Sep 30 14:31:27 crc kubenswrapper[4676]: I0930 14:31:27.444475 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ae48f1-6128-48b6-a472-a35226869dc2" path="/var/lib/kubelet/pods/34ae48f1-6128-48b6-a472-a35226869dc2/volumes" Sep 30 14:31:27 crc kubenswrapper[4676]: I0930 14:31:27.445055 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a" path="/var/lib/kubelet/pods/d2c5910c-a5af-4d9f-9c16-81ac73cd7a5a/volumes" Sep 30 14:31:35 crc kubenswrapper[4676]: I0930 14:31:35.034356 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1872-account-create-m4vlg"] Sep 30 14:31:35 crc kubenswrapper[4676]: I0930 14:31:35.045386 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5f61-account-create-jbd7r"] Sep 30 14:31:35 crc kubenswrapper[4676]: I0930 14:31:35.053360 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1872-account-create-m4vlg"] Sep 30 14:31:35 crc kubenswrapper[4676]: I0930 14:31:35.060994 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5f61-account-create-jbd7r"] Sep 30 14:31:35 crc kubenswrapper[4676]: I0930 14:31:35.445519 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="96101623-7868-4b78-8c66-9aeb8e5e8ec5" path="/var/lib/kubelet/pods/96101623-7868-4b78-8c66-9aeb8e5e8ec5/volumes" Sep 30 14:31:35 crc kubenswrapper[4676]: I0930 14:31:35.446535 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8eb7b5d-dac6-4093-a364-cc7311e159ef" path="/var/lib/kubelet/pods/d8eb7b5d-dac6-4093-a364-cc7311e159ef/volumes" Sep 30 14:31:36 crc kubenswrapper[4676]: I0930 14:31:36.032418 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ccb8-account-create-f8xw8"] Sep 30 14:31:36 crc kubenswrapper[4676]: I0930 14:31:36.042544 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ccb8-account-create-f8xw8"] Sep 30 14:31:36 crc kubenswrapper[4676]: I0930 14:31:36.112052 4676 generic.go:334] "Generic (PLEG): container finished" podID="24983a6b-dac1-4567-b8b8-ded54e7287bb" containerID="729bf626cb28a393ae13ae5835c5ad6b364ce23c873234cccb8d24597ee15a4f" exitCode=0 Sep 30 14:31:36 crc kubenswrapper[4676]: I0930 14:31:36.112100 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g" event={"ID":"24983a6b-dac1-4567-b8b8-ded54e7287bb","Type":"ContainerDied","Data":"729bf626cb28a393ae13ae5835c5ad6b364ce23c873234cccb8d24597ee15a4f"} Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.445374 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69502754-998e-4124-8956-67747a925b66" path="/var/lib/kubelet/pods/69502754-998e-4124-8956-67747a925b66/volumes" Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.519702 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g" Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.572158 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-inventory\") pod \"24983a6b-dac1-4567-b8b8-ded54e7287bb\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.572294 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2qbb\" (UniqueName: \"kubernetes.io/projected/24983a6b-dac1-4567-b8b8-ded54e7287bb-kube-api-access-s2qbb\") pod \"24983a6b-dac1-4567-b8b8-ded54e7287bb\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.572344 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-ssh-key\") pod \"24983a6b-dac1-4567-b8b8-ded54e7287bb\" (UID: \"24983a6b-dac1-4567-b8b8-ded54e7287bb\") " Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.592131 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24983a6b-dac1-4567-b8b8-ded54e7287bb-kube-api-access-s2qbb" (OuterVolumeSpecName: "kube-api-access-s2qbb") pod "24983a6b-dac1-4567-b8b8-ded54e7287bb" (UID: "24983a6b-dac1-4567-b8b8-ded54e7287bb"). InnerVolumeSpecName "kube-api-access-s2qbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.604461 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "24983a6b-dac1-4567-b8b8-ded54e7287bb" (UID: "24983a6b-dac1-4567-b8b8-ded54e7287bb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.617976 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-inventory" (OuterVolumeSpecName: "inventory") pod "24983a6b-dac1-4567-b8b8-ded54e7287bb" (UID: "24983a6b-dac1-4567-b8b8-ded54e7287bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.680225 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.680271 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2qbb\" (UniqueName: \"kubernetes.io/projected/24983a6b-dac1-4567-b8b8-ded54e7287bb-kube-api-access-s2qbb\") on node \"crc\" DevicePath \"\"" Sep 30 14:31:37 crc kubenswrapper[4676]: I0930 14:31:37.680283 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/24983a6b-dac1-4567-b8b8-ded54e7287bb-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.131337 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g" event={"ID":"24983a6b-dac1-4567-b8b8-ded54e7287bb","Type":"ContainerDied","Data":"063f660872a97fc439944ded324989c5741b6f5ac2240341caf19ca2744b829a"} Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.131385 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="063f660872a97fc439944ded324989c5741b6f5ac2240341caf19ca2744b829a" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.131418 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-56d9g" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.212418 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp"] Sep 30 14:31:38 crc kubenswrapper[4676]: E0930 14:31:38.213075 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24983a6b-dac1-4567-b8b8-ded54e7287bb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.213194 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="24983a6b-dac1-4567-b8b8-ded54e7287bb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.213440 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="24983a6b-dac1-4567-b8b8-ded54e7287bb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.214143 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.216642 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.216942 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.217024 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.217394 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.234676 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp"] Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.292653 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.292929 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj9lj\" (UniqueName: \"kubernetes.io/projected/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-kube-api-access-tj9lj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 
14:31:38.292969 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.394484 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj9lj\" (UniqueName: \"kubernetes.io/projected/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-kube-api-access-tj9lj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.394543 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.394595 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.399349 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.399410 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.421961 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj9lj\" (UniqueName: \"kubernetes.io/projected/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-kube-api-access-tj9lj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:38 crc kubenswrapper[4676]: I0930 14:31:38.530703 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:39 crc kubenswrapper[4676]: I0930 14:31:39.108567 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp"] Sep 30 14:31:39 crc kubenswrapper[4676]: I0930 14:31:39.144605 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" event={"ID":"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b","Type":"ContainerStarted","Data":"46dd61f00e55ecc6482fdec65059b15ed9576924ec3c962df0025e6acd38414d"} Sep 30 14:31:39 crc kubenswrapper[4676]: I0930 14:31:39.434157 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:31:39 crc kubenswrapper[4676]: E0930 14:31:39.434516 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:31:40 crc kubenswrapper[4676]: I0930 14:31:40.155744 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" event={"ID":"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b","Type":"ContainerStarted","Data":"43135b8ab063d80114d4dd2836755b9fb4804a3ceb435b8f8c2d7461de70f597"} Sep 30 14:31:40 crc kubenswrapper[4676]: I0930 14:31:40.177867 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" podStartSLOduration=1.679678571 podStartE2EDuration="2.177839597s" podCreationTimestamp="2025-09-30 14:31:38 +0000 UTC" 
firstStartedPulling="2025-09-30 14:31:39.112705364 +0000 UTC m=+2003.095793793" lastFinishedPulling="2025-09-30 14:31:39.61086639 +0000 UTC m=+2003.593954819" observedRunningTime="2025-09-30 14:31:40.171929538 +0000 UTC m=+2004.155017967" watchObservedRunningTime="2025-09-30 14:31:40.177839597 +0000 UTC m=+2004.160928026" Sep 30 14:31:45 crc kubenswrapper[4676]: I0930 14:31:45.205231 4676 generic.go:334] "Generic (PLEG): container finished" podID="d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b" containerID="43135b8ab063d80114d4dd2836755b9fb4804a3ceb435b8f8c2d7461de70f597" exitCode=0 Sep 30 14:31:45 crc kubenswrapper[4676]: I0930 14:31:45.205359 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" event={"ID":"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b","Type":"ContainerDied","Data":"43135b8ab063d80114d4dd2836755b9fb4804a3ceb435b8f8c2d7461de70f597"} Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.637401 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.693393 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj9lj\" (UniqueName: \"kubernetes.io/projected/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-kube-api-access-tj9lj\") pod \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.693487 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-ssh-key\") pod \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.693615 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-inventory\") pod \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\" (UID: \"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b\") " Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.699443 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-kube-api-access-tj9lj" (OuterVolumeSpecName: "kube-api-access-tj9lj") pod "d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b" (UID: "d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b"). InnerVolumeSpecName "kube-api-access-tj9lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.724241 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-inventory" (OuterVolumeSpecName: "inventory") pod "d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b" (UID: "d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.725305 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b" (UID: "d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.796985 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.797307 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj9lj\" (UniqueName: \"kubernetes.io/projected/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-kube-api-access-tj9lj\") on node \"crc\" DevicePath \"\"" Sep 30 14:31:46 crc kubenswrapper[4676]: I0930 14:31:46.797326 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.225755 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" event={"ID":"d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b","Type":"ContainerDied","Data":"46dd61f00e55ecc6482fdec65059b15ed9576924ec3c962df0025e6acd38414d"} Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.225818 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.225833 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46dd61f00e55ecc6482fdec65059b15ed9576924ec3c962df0025e6acd38414d" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.297947 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb"] Sep 30 14:31:47 crc kubenswrapper[4676]: E0930 14:31:47.298307 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.298324 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.298553 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.299179 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.303490 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.303902 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.306352 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.306840 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.312781 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb"] Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.409376 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg77p\" (UniqueName: \"kubernetes.io/projected/29932146-0fdd-4717-8a42-2b04967df9ce-kube-api-access-cg77p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.409452 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.409803 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.512523 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg77p\" (UniqueName: \"kubernetes.io/projected/29932146-0fdd-4717-8a42-2b04967df9ce-kube-api-access-cg77p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.512639 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.512774 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.519634 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: 
\"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.520303 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.531627 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg77p\" (UniqueName: \"kubernetes.io/projected/29932146-0fdd-4717-8a42-2b04967df9ce-kube-api-access-cg77p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8kmb\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:47 crc kubenswrapper[4676]: I0930 14:31:47.620512 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:31:48 crc kubenswrapper[4676]: I0930 14:31:48.184128 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb"] Sep 30 14:31:48 crc kubenswrapper[4676]: I0930 14:31:48.237577 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" event={"ID":"29932146-0fdd-4717-8a42-2b04967df9ce","Type":"ContainerStarted","Data":"e33a001b1dbff072f7de195d8d5767c971aba85f5791058b02029ba12dd597e8"} Sep 30 14:31:50 crc kubenswrapper[4676]: I0930 14:31:50.258059 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" event={"ID":"29932146-0fdd-4717-8a42-2b04967df9ce","Type":"ContainerStarted","Data":"69791d571d1f520f3b5992ed9cb5220c5d0c9e9fd861564fe065684e9eb4318e"} Sep 30 14:31:50 crc kubenswrapper[4676]: I0930 14:31:50.280456 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" podStartSLOduration=2.43376672 podStartE2EDuration="3.280431755s" podCreationTimestamp="2025-09-30 14:31:47 +0000 UTC" firstStartedPulling="2025-09-30 14:31:48.191485065 +0000 UTC m=+2012.174573494" lastFinishedPulling="2025-09-30 14:31:49.0381501 +0000 UTC m=+2013.021238529" observedRunningTime="2025-09-30 14:31:50.274955743 +0000 UTC m=+2014.258044182" watchObservedRunningTime="2025-09-30 14:31:50.280431755 +0000 UTC m=+2014.263520184" Sep 30 14:31:53 crc kubenswrapper[4676]: I0930 14:31:53.433649 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:31:53 crc kubenswrapper[4676]: E0930 14:31:53.434466 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:32:03 crc kubenswrapper[4676]: I0930 14:32:03.037998 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-589w4"] Sep 30 14:32:03 crc kubenswrapper[4676]: I0930 14:32:03.056662 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-589w4"] Sep 30 14:32:03 crc kubenswrapper[4676]: I0930 14:32:03.444787 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f239b5-877b-4291-8481-6a121c25bff9" path="/var/lib/kubelet/pods/18f239b5-877b-4291-8481-6a121c25bff9/volumes" Sep 30 14:32:05 crc kubenswrapper[4676]: I0930 14:32:05.433462 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:32:05 crc kubenswrapper[4676]: E0930 14:32:05.434083 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.319169 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dbbqx"] Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.323507 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.345667 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dbbqx"] Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.426954 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thfc6\" (UniqueName: \"kubernetes.io/projected/601a5049-f2d3-460d-9f5d-6c85ab4d192f-kube-api-access-thfc6\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.427044 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-utilities\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.427072 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-catalog-content\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.529283 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thfc6\" (UniqueName: \"kubernetes.io/projected/601a5049-f2d3-460d-9f5d-6c85ab4d192f-kube-api-access-thfc6\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.529385 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-utilities\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.529409 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-catalog-content\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.530039 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-utilities\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.530088 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-catalog-content\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.555761 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thfc6\" (UniqueName: \"kubernetes.io/projected/601a5049-f2d3-460d-9f5d-6c85ab4d192f-kube-api-access-thfc6\") pod \"certified-operators-dbbqx\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.644241 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:13 crc kubenswrapper[4676]: I0930 14:32:13.982801 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dbbqx"] Sep 30 14:32:14 crc kubenswrapper[4676]: I0930 14:32:14.456973 4676 generic.go:334] "Generic (PLEG): container finished" podID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerID="18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528" exitCode=0 Sep 30 14:32:14 crc kubenswrapper[4676]: I0930 14:32:14.457100 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbbqx" event={"ID":"601a5049-f2d3-460d-9f5d-6c85ab4d192f","Type":"ContainerDied","Data":"18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528"} Sep 30 14:32:14 crc kubenswrapper[4676]: I0930 14:32:14.457628 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbbqx" event={"ID":"601a5049-f2d3-460d-9f5d-6c85ab4d192f","Type":"ContainerStarted","Data":"fae898da6797872e6e15ddde173e509988ec80a53fc59407fbf2c21dab26cb67"} Sep 30 14:32:16 crc kubenswrapper[4676]: I0930 14:32:16.477582 4676 generic.go:334] "Generic (PLEG): container finished" podID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerID="f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b" exitCode=0 Sep 30 14:32:16 crc kubenswrapper[4676]: I0930 14:32:16.477616 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbbqx" event={"ID":"601a5049-f2d3-460d-9f5d-6c85ab4d192f","Type":"ContainerDied","Data":"f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b"} Sep 30 14:32:17 crc kubenswrapper[4676]: I0930 14:32:17.432981 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:32:17 crc kubenswrapper[4676]: E0930 14:32:17.433926 4676 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:32:17 crc kubenswrapper[4676]: I0930 14:32:17.492214 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbbqx" event={"ID":"601a5049-f2d3-460d-9f5d-6c85ab4d192f","Type":"ContainerStarted","Data":"56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2"} Sep 30 14:32:17 crc kubenswrapper[4676]: I0930 14:32:17.515002 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dbbqx" podStartSLOduration=1.840912053 podStartE2EDuration="4.514974557s" podCreationTimestamp="2025-09-30 14:32:13 +0000 UTC" firstStartedPulling="2025-09-30 14:32:14.45941496 +0000 UTC m=+2038.442503389" lastFinishedPulling="2025-09-30 14:32:17.133477464 +0000 UTC m=+2041.116565893" observedRunningTime="2025-09-30 14:32:17.512252917 +0000 UTC m=+2041.495341356" watchObservedRunningTime="2025-09-30 14:32:17.514974557 +0000 UTC m=+2041.498062986" Sep 30 14:32:21 crc kubenswrapper[4676]: I0930 14:32:21.788617 4676 scope.go:117] "RemoveContainer" containerID="097eb66c9d5a70284123973c8052d068773ffef68ed246953b44e1c9e247d670" Sep 30 14:32:21 crc kubenswrapper[4676]: I0930 14:32:21.812387 4676 scope.go:117] "RemoveContainer" containerID="15fe0d9acd3c959cd25fc1a238869dc72f39e3c8750058b321b878b777976149" Sep 30 14:32:21 crc kubenswrapper[4676]: I0930 14:32:21.870618 4676 scope.go:117] "RemoveContainer" containerID="bcf77ab230c8ecdb557bad36d67b63abf7e8df5071f117b4ab224fd883d64d3c" Sep 30 14:32:21 crc kubenswrapper[4676]: I0930 14:32:21.924523 4676 scope.go:117] "RemoveContainer" 
containerID="7c401ca1e5312a6509c27ce37f7ec2ddd0f578f113baa2f49b4f416f04b3eb5c" Sep 30 14:32:21 crc kubenswrapper[4676]: I0930 14:32:21.993334 4676 scope.go:117] "RemoveContainer" containerID="7b1734f93cb29c2b48f80dbd407a8fb9e804f31fa645c0bfce7cc6d82e975343" Sep 30 14:32:22 crc kubenswrapper[4676]: I0930 14:32:22.022091 4676 scope.go:117] "RemoveContainer" containerID="9d99058d6bebd5c65f55c6f30f0b2166b4dedbca8a5b7726560e1f66b07e386b" Sep 30 14:32:22 crc kubenswrapper[4676]: I0930 14:32:22.076399 4676 scope.go:117] "RemoveContainer" containerID="aaead5d4de93e4ccde86968661a2507e4c9ca3492a9b560f080713716c5f05ad" Sep 30 14:32:23 crc kubenswrapper[4676]: I0930 14:32:23.036342 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jpk85"] Sep 30 14:32:23 crc kubenswrapper[4676]: I0930 14:32:23.043743 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jpk85"] Sep 30 14:32:23 crc kubenswrapper[4676]: I0930 14:32:23.444502 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d2e03c1-e9fc-4b6d-a755-0582ad936263" path="/var/lib/kubelet/pods/1d2e03c1-e9fc-4b6d-a755-0582ad936263/volumes" Sep 30 14:32:23 crc kubenswrapper[4676]: I0930 14:32:23.645578 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:23 crc kubenswrapper[4676]: I0930 14:32:23.645640 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:23 crc kubenswrapper[4676]: I0930 14:32:23.694989 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:24 crc kubenswrapper[4676]: I0930 14:32:24.552408 4676 generic.go:334] "Generic (PLEG): container finished" podID="29932146-0fdd-4717-8a42-2b04967df9ce" 
containerID="69791d571d1f520f3b5992ed9cb5220c5d0c9e9fd861564fe065684e9eb4318e" exitCode=0 Sep 30 14:32:24 crc kubenswrapper[4676]: I0930 14:32:24.552516 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" event={"ID":"29932146-0fdd-4717-8a42-2b04967df9ce","Type":"ContainerDied","Data":"69791d571d1f520f3b5992ed9cb5220c5d0c9e9fd861564fe065684e9eb4318e"} Sep 30 14:32:24 crc kubenswrapper[4676]: I0930 14:32:24.636190 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:24 crc kubenswrapper[4676]: I0930 14:32:24.689609 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dbbqx"] Sep 30 14:32:25 crc kubenswrapper[4676]: I0930 14:32:25.962431 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:32:25 crc kubenswrapper[4676]: I0930 14:32:25.988161 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-inventory\") pod \"29932146-0fdd-4717-8a42-2b04967df9ce\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " Sep 30 14:32:25 crc kubenswrapper[4676]: I0930 14:32:25.988505 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-ssh-key\") pod \"29932146-0fdd-4717-8a42-2b04967df9ce\" (UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " Sep 30 14:32:25 crc kubenswrapper[4676]: I0930 14:32:25.988699 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg77p\" (UniqueName: \"kubernetes.io/projected/29932146-0fdd-4717-8a42-2b04967df9ce-kube-api-access-cg77p\") pod \"29932146-0fdd-4717-8a42-2b04967df9ce\" 
(UID: \"29932146-0fdd-4717-8a42-2b04967df9ce\") " Sep 30 14:32:25 crc kubenswrapper[4676]: I0930 14:32:25.994658 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29932146-0fdd-4717-8a42-2b04967df9ce-kube-api-access-cg77p" (OuterVolumeSpecName: "kube-api-access-cg77p") pod "29932146-0fdd-4717-8a42-2b04967df9ce" (UID: "29932146-0fdd-4717-8a42-2b04967df9ce"). InnerVolumeSpecName "kube-api-access-cg77p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.025136 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-inventory" (OuterVolumeSpecName: "inventory") pod "29932146-0fdd-4717-8a42-2b04967df9ce" (UID: "29932146-0fdd-4717-8a42-2b04967df9ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.029750 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwwr8"] Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.038447 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29932146-0fdd-4717-8a42-2b04967df9ce" (UID: "29932146-0fdd-4717-8a42-2b04967df9ce"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.044301 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bwwr8"] Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.091098 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.091142 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29932146-0fdd-4717-8a42-2b04967df9ce-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.091156 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg77p\" (UniqueName: \"kubernetes.io/projected/29932146-0fdd-4717-8a42-2b04967df9ce-kube-api-access-cg77p\") on node \"crc\" DevicePath \"\"" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.571401 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dbbqx" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="registry-server" containerID="cri-o://56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2" gracePeriod=2 Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.572030 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.574316 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8kmb" event={"ID":"29932146-0fdd-4717-8a42-2b04967df9ce","Type":"ContainerDied","Data":"e33a001b1dbff072f7de195d8d5767c971aba85f5791058b02029ba12dd597e8"} Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.574375 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e33a001b1dbff072f7de195d8d5767c971aba85f5791058b02029ba12dd597e8" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.662397 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz"] Sep 30 14:32:26 crc kubenswrapper[4676]: E0930 14:32:26.663265 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29932146-0fdd-4717-8a42-2b04967df9ce" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.663286 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="29932146-0fdd-4717-8a42-2b04967df9ce" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.663501 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="29932146-0fdd-4717-8a42-2b04967df9ce" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.664182 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.667245 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.667306 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.667245 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.667630 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.675661 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz"] Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.703973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.704029 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gssnx\" (UniqueName: \"kubernetes.io/projected/6f7ebae7-0748-4052-859c-fb6a5fa89d33-kube-api-access-gssnx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.704062 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.805973 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.806021 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gssnx\" (UniqueName: \"kubernetes.io/projected/6f7ebae7-0748-4052-859c-fb6a5fa89d33-kube-api-access-gssnx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.806045 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.811822 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: 
\"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.821787 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:26 crc kubenswrapper[4676]: I0930 14:32:26.823366 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gssnx\" (UniqueName: \"kubernetes.io/projected/6f7ebae7-0748-4052-859c-fb6a5fa89d33-kube-api-access-gssnx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.020108 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.047418 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.112144 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thfc6\" (UniqueName: \"kubernetes.io/projected/601a5049-f2d3-460d-9f5d-6c85ab4d192f-kube-api-access-thfc6\") pod \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.112338 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-utilities\") pod \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.112374 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-catalog-content\") pod \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\" (UID: \"601a5049-f2d3-460d-9f5d-6c85ab4d192f\") " Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.113729 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-utilities" (OuterVolumeSpecName: "utilities") pod "601a5049-f2d3-460d-9f5d-6c85ab4d192f" (UID: "601a5049-f2d3-460d-9f5d-6c85ab4d192f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.118072 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/601a5049-f2d3-460d-9f5d-6c85ab4d192f-kube-api-access-thfc6" (OuterVolumeSpecName: "kube-api-access-thfc6") pod "601a5049-f2d3-460d-9f5d-6c85ab4d192f" (UID: "601a5049-f2d3-460d-9f5d-6c85ab4d192f"). InnerVolumeSpecName "kube-api-access-thfc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.174296 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "601a5049-f2d3-460d-9f5d-6c85ab4d192f" (UID: "601a5049-f2d3-460d-9f5d-6c85ab4d192f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.214500 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thfc6\" (UniqueName: \"kubernetes.io/projected/601a5049-f2d3-460d-9f5d-6c85ab4d192f-kube-api-access-thfc6\") on node \"crc\" DevicePath \"\"" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.214924 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.214941 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/601a5049-f2d3-460d-9f5d-6c85ab4d192f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.451530 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279b0c8e-a8bf-457a-a345-ba9c1309c118" path="/var/lib/kubelet/pods/279b0c8e-a8bf-457a-a345-ba9c1309c118/volumes" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.567071 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz"] Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.584842 4676 generic.go:334] "Generic (PLEG): container finished" podID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerID="56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2" 
exitCode=0 Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.584967 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbbqx" event={"ID":"601a5049-f2d3-460d-9f5d-6c85ab4d192f","Type":"ContainerDied","Data":"56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2"} Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.585023 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dbbqx" event={"ID":"601a5049-f2d3-460d-9f5d-6c85ab4d192f","Type":"ContainerDied","Data":"fae898da6797872e6e15ddde173e509988ec80a53fc59407fbf2c21dab26cb67"} Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.584983 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dbbqx" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.585049 4676 scope.go:117] "RemoveContainer" containerID="56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.613346 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dbbqx"] Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.615055 4676 scope.go:117] "RemoveContainer" containerID="f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.622337 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dbbqx"] Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.649621 4676 scope.go:117] "RemoveContainer" containerID="18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.670963 4676 scope.go:117] "RemoveContainer" containerID="56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2" Sep 30 14:32:27 crc kubenswrapper[4676]: E0930 14:32:27.671505 4676 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2\": container with ID starting with 56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2 not found: ID does not exist" containerID="56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.671561 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2"} err="failed to get container status \"56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2\": rpc error: code = NotFound desc = could not find container \"56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2\": container with ID starting with 56deaad2dc97c293bc7d2f49a27dcfe4caafc4d2eecb6e7bcf0618d273307eb2 not found: ID does not exist" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.671591 4676 scope.go:117] "RemoveContainer" containerID="f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b" Sep 30 14:32:27 crc kubenswrapper[4676]: E0930 14:32:27.672278 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b\": container with ID starting with f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b not found: ID does not exist" containerID="f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.672308 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b"} err="failed to get container status \"f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b\": rpc error: code = NotFound desc = could 
not find container \"f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b\": container with ID starting with f43b21972a98ee5c8c8b6a82584aa40b89c7804cec52dd430fcfd28d8b79022b not found: ID does not exist" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.672327 4676 scope.go:117] "RemoveContainer" containerID="18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528" Sep 30 14:32:27 crc kubenswrapper[4676]: E0930 14:32:27.672632 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528\": container with ID starting with 18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528 not found: ID does not exist" containerID="18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528" Sep 30 14:32:27 crc kubenswrapper[4676]: I0930 14:32:27.672664 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528"} err="failed to get container status \"18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528\": rpc error: code = NotFound desc = could not find container \"18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528\": container with ID starting with 18eb42ace96cd361d6d6951cdafac694abff09667afd9055caa324fc044c3528 not found: ID does not exist" Sep 30 14:32:28 crc kubenswrapper[4676]: I0930 14:32:28.596229 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" event={"ID":"6f7ebae7-0748-4052-859c-fb6a5fa89d33","Type":"ContainerStarted","Data":"6e9ca48ed36490af8b024d4f7b7f88ed8641c5a7186ec7323d7c5aa77a0e8e8a"} Sep 30 14:32:28 crc kubenswrapper[4676]: I0930 14:32:28.596590 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" 
event={"ID":"6f7ebae7-0748-4052-859c-fb6a5fa89d33","Type":"ContainerStarted","Data":"f7934d4eacb5d2955b946923d3bbab595f93f2733c32d8e95677abd3fbbe461a"} Sep 30 14:32:28 crc kubenswrapper[4676]: I0930 14:32:28.621679 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" podStartSLOduration=1.9793247109999998 podStartE2EDuration="2.621661189s" podCreationTimestamp="2025-09-30 14:32:26 +0000 UTC" firstStartedPulling="2025-09-30 14:32:27.575454469 +0000 UTC m=+2051.558542898" lastFinishedPulling="2025-09-30 14:32:28.217790947 +0000 UTC m=+2052.200879376" observedRunningTime="2025-09-30 14:32:28.614174076 +0000 UTC m=+2052.597262505" watchObservedRunningTime="2025-09-30 14:32:28.621661189 +0000 UTC m=+2052.604749618" Sep 30 14:32:29 crc kubenswrapper[4676]: I0930 14:32:29.458894 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" path="/var/lib/kubelet/pods/601a5049-f2d3-460d-9f5d-6c85ab4d192f/volumes" Sep 30 14:32:32 crc kubenswrapper[4676]: I0930 14:32:32.434306 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:32:33 crc kubenswrapper[4676]: I0930 14:32:33.646892 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"587ffe0c6c9654995cc627899f735a2c46498e902ecdd2084cd7e9cf322d7a4f"} Sep 30 14:33:06 crc kubenswrapper[4676]: I0930 14:33:06.039078 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4fvlz"] Sep 30 14:33:06 crc kubenswrapper[4676]: I0930 14:33:06.047331 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4fvlz"] Sep 30 14:33:07 crc kubenswrapper[4676]: I0930 14:33:07.446290 4676 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e77accdf-897d-4abd-b12c-b28bf6406a78" path="/var/lib/kubelet/pods/e77accdf-897d-4abd-b12c-b28bf6406a78/volumes" Sep 30 14:33:22 crc kubenswrapper[4676]: I0930 14:33:22.065953 4676 generic.go:334] "Generic (PLEG): container finished" podID="6f7ebae7-0748-4052-859c-fb6a5fa89d33" containerID="6e9ca48ed36490af8b024d4f7b7f88ed8641c5a7186ec7323d7c5aa77a0e8e8a" exitCode=2 Sep 30 14:33:22 crc kubenswrapper[4676]: I0930 14:33:22.066031 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" event={"ID":"6f7ebae7-0748-4052-859c-fb6a5fa89d33","Type":"ContainerDied","Data":"6e9ca48ed36490af8b024d4f7b7f88ed8641c5a7186ec7323d7c5aa77a0e8e8a"} Sep 30 14:33:22 crc kubenswrapper[4676]: I0930 14:33:22.231830 4676 scope.go:117] "RemoveContainer" containerID="61489785c3d00ad3a9ec3beb08cedeeb572595f3b5c955cdb582f80269cd7247" Sep 30 14:33:22 crc kubenswrapper[4676]: I0930 14:33:22.270085 4676 scope.go:117] "RemoveContainer" containerID="c71d74e0bbf7d6f48c794a196ed4ac131c92a4a7148bf33f1be1e83fd93f7e73" Sep 30 14:33:22 crc kubenswrapper[4676]: I0930 14:33:22.326361 4676 scope.go:117] "RemoveContainer" containerID="26d0c3000bad072b538f1bc904894eea3b3bb242351875c7c5b939e6c2114141" Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.492453 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.648221 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gssnx\" (UniqueName: \"kubernetes.io/projected/6f7ebae7-0748-4052-859c-fb6a5fa89d33-kube-api-access-gssnx\") pod \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.648479 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-inventory\") pod \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.648520 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-ssh-key\") pod \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\" (UID: \"6f7ebae7-0748-4052-859c-fb6a5fa89d33\") " Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.657367 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7ebae7-0748-4052-859c-fb6a5fa89d33-kube-api-access-gssnx" (OuterVolumeSpecName: "kube-api-access-gssnx") pod "6f7ebae7-0748-4052-859c-fb6a5fa89d33" (UID: "6f7ebae7-0748-4052-859c-fb6a5fa89d33"). InnerVolumeSpecName "kube-api-access-gssnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.679641 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-inventory" (OuterVolumeSpecName: "inventory") pod "6f7ebae7-0748-4052-859c-fb6a5fa89d33" (UID: "6f7ebae7-0748-4052-859c-fb6a5fa89d33"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.688558 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f7ebae7-0748-4052-859c-fb6a5fa89d33" (UID: "6f7ebae7-0748-4052-859c-fb6a5fa89d33"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.751376 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.752046 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f7ebae7-0748-4052-859c-fb6a5fa89d33-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:23 crc kubenswrapper[4676]: I0930 14:33:23.752146 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gssnx\" (UniqueName: \"kubernetes.io/projected/6f7ebae7-0748-4052-859c-fb6a5fa89d33-kube-api-access-gssnx\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:24 crc kubenswrapper[4676]: I0930 14:33:24.088776 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" event={"ID":"6f7ebae7-0748-4052-859c-fb6a5fa89d33","Type":"ContainerDied","Data":"f7934d4eacb5d2955b946923d3bbab595f93f2733c32d8e95677abd3fbbe461a"} Sep 30 14:33:24 crc kubenswrapper[4676]: I0930 14:33:24.088822 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7934d4eacb5d2955b946923d3bbab595f93f2733c32d8e95677abd3fbbe461a" Sep 30 14:33:24 crc kubenswrapper[4676]: I0930 14:33:24.088851 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.034140 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc"] Sep 30 14:33:31 crc kubenswrapper[4676]: E0930 14:33:31.035802 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="extract-utilities" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.035831 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="extract-utilities" Sep 30 14:33:31 crc kubenswrapper[4676]: E0930 14:33:31.035851 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7ebae7-0748-4052-859c-fb6a5fa89d33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.035861 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7ebae7-0748-4052-859c-fb6a5fa89d33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:33:31 crc kubenswrapper[4676]: E0930 14:33:31.035943 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="extract-content" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.035955 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="extract-content" Sep 30 14:33:31 crc kubenswrapper[4676]: E0930 14:33:31.035968 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="registry-server" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.035977 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="registry-server" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 
14:33:31.036224 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="601a5049-f2d3-460d-9f5d-6c85ab4d192f" containerName="registry-server" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.036261 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7ebae7-0748-4052-859c-fb6a5fa89d33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.037317 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.040069 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.041666 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.042539 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.042746 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.043785 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc"] Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.215902 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: 
I0930 14:33:31.216025 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhsgn\" (UniqueName: \"kubernetes.io/projected/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-kube-api-access-lhsgn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.216241 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.318017 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.318145 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.318213 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhsgn\" (UniqueName: 
\"kubernetes.io/projected/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-kube-api-access-lhsgn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.324892 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.336960 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhsgn\" (UniqueName: \"kubernetes.io/projected/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-kube-api-access-lhsgn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.341094 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.367317 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:33:31 crc kubenswrapper[4676]: I0930 14:33:31.953982 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc"] Sep 30 14:33:32 crc kubenswrapper[4676]: I0930 14:33:32.166987 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" event={"ID":"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7","Type":"ContainerStarted","Data":"4d2494d465ca86715d234fd617d327d62d38fc188f13b57433981565294393dc"} Sep 30 14:33:33 crc kubenswrapper[4676]: I0930 14:33:33.194307 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" event={"ID":"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7","Type":"ContainerStarted","Data":"f94919ccbfefbf8bceb1536f7cb7d19e6bb680ad33f640e85fb9fd7bdd764376"} Sep 30 14:33:33 crc kubenswrapper[4676]: I0930 14:33:33.220411 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" podStartSLOduration=1.460806467 podStartE2EDuration="2.220385573s" podCreationTimestamp="2025-09-30 14:33:31 +0000 UTC" firstStartedPulling="2025-09-30 14:33:31.959368976 +0000 UTC m=+2115.942457415" lastFinishedPulling="2025-09-30 14:33:32.718948082 +0000 UTC m=+2116.702036521" observedRunningTime="2025-09-30 14:33:33.215533605 +0000 UTC m=+2117.198622034" watchObservedRunningTime="2025-09-30 14:33:33.220385573 +0000 UTC m=+2117.203474002" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.331674 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rdn2k"] Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.334944 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.350781 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdn2k"] Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.487134 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-utilities\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.487211 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4nx\" (UniqueName: \"kubernetes.io/projected/64e83588-fc5d-46c8-ac17-2d8cddb97d90-kube-api-access-rg4nx\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.487626 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-catalog-content\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.527205 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8kvr7"] Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.530169 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.545037 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kvr7"] Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.590322 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-catalog-content\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.590471 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-utilities\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.590505 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4nx\" (UniqueName: \"kubernetes.io/projected/64e83588-fc5d-46c8-ac17-2d8cddb97d90-kube-api-access-rg4nx\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.591055 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-catalog-content\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.591392 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-utilities\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.613428 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4nx\" (UniqueName: \"kubernetes.io/projected/64e83588-fc5d-46c8-ac17-2d8cddb97d90-kube-api-access-rg4nx\") pod \"community-operators-rdn2k\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.659964 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.693068 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gqt8\" (UniqueName: \"kubernetes.io/projected/fc3c389e-a38f-481d-a180-989c52166852-kube-api-access-5gqt8\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.693620 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-catalog-content\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.693671 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-utilities\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " 
pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.796122 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-catalog-content\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.796221 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-utilities\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.796385 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gqt8\" (UniqueName: \"kubernetes.io/projected/fc3c389e-a38f-481d-a180-989c52166852-kube-api-access-5gqt8\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.796647 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-catalog-content\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.796740 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-utilities\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" 
Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.817988 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gqt8\" (UniqueName: \"kubernetes.io/projected/fc3c389e-a38f-481d-a180-989c52166852-kube-api-access-5gqt8\") pod \"redhat-marketplace-8kvr7\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:34 crc kubenswrapper[4676]: I0930 14:33:34.849194 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:35 crc kubenswrapper[4676]: I0930 14:33:35.313389 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rdn2k"] Sep 30 14:33:35 crc kubenswrapper[4676]: I0930 14:33:35.417013 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kvr7"] Sep 30 14:33:35 crc kubenswrapper[4676]: W0930 14:33:35.424079 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc3c389e_a38f_481d_a180_989c52166852.slice/crio-d7b903e1560a5eeaa8f5a798e69fa936daec6c99b6905491afb0389daa501586 WatchSource:0}: Error finding container d7b903e1560a5eeaa8f5a798e69fa936daec6c99b6905491afb0389daa501586: Status 404 returned error can't find the container with id d7b903e1560a5eeaa8f5a798e69fa936daec6c99b6905491afb0389daa501586 Sep 30 14:33:36 crc kubenswrapper[4676]: I0930 14:33:36.225078 4676 generic.go:334] "Generic (PLEG): container finished" podID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerID="48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb" exitCode=0 Sep 30 14:33:36 crc kubenswrapper[4676]: I0930 14:33:36.225282 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdn2k" 
event={"ID":"64e83588-fc5d-46c8-ac17-2d8cddb97d90","Type":"ContainerDied","Data":"48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb"} Sep 30 14:33:36 crc kubenswrapper[4676]: I0930 14:33:36.225489 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdn2k" event={"ID":"64e83588-fc5d-46c8-ac17-2d8cddb97d90","Type":"ContainerStarted","Data":"d7ebf02e93512ffd4a8b8b827f2c6ffaa8657d0f7c1f5b791fe0645ec44a3951"} Sep 30 14:33:36 crc kubenswrapper[4676]: I0930 14:33:36.230289 4676 generic.go:334] "Generic (PLEG): container finished" podID="fc3c389e-a38f-481d-a180-989c52166852" containerID="b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e" exitCode=0 Sep 30 14:33:36 crc kubenswrapper[4676]: I0930 14:33:36.230341 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kvr7" event={"ID":"fc3c389e-a38f-481d-a180-989c52166852","Type":"ContainerDied","Data":"b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e"} Sep 30 14:33:36 crc kubenswrapper[4676]: I0930 14:33:36.230365 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kvr7" event={"ID":"fc3c389e-a38f-481d-a180-989c52166852","Type":"ContainerStarted","Data":"d7b903e1560a5eeaa8f5a798e69fa936daec6c99b6905491afb0389daa501586"} Sep 30 14:33:37 crc kubenswrapper[4676]: I0930 14:33:37.247517 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdn2k" event={"ID":"64e83588-fc5d-46c8-ac17-2d8cddb97d90","Type":"ContainerStarted","Data":"c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945"} Sep 30 14:33:38 crc kubenswrapper[4676]: I0930 14:33:38.262195 4676 generic.go:334] "Generic (PLEG): container finished" podID="fc3c389e-a38f-481d-a180-989c52166852" containerID="9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754" exitCode=0 Sep 30 14:33:38 crc kubenswrapper[4676]: I0930 
14:33:38.262295 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kvr7" event={"ID":"fc3c389e-a38f-481d-a180-989c52166852","Type":"ContainerDied","Data":"9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754"} Sep 30 14:33:38 crc kubenswrapper[4676]: I0930 14:33:38.265259 4676 generic.go:334] "Generic (PLEG): container finished" podID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerID="c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945" exitCode=0 Sep 30 14:33:38 crc kubenswrapper[4676]: I0930 14:33:38.265345 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdn2k" event={"ID":"64e83588-fc5d-46c8-ac17-2d8cddb97d90","Type":"ContainerDied","Data":"c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945"} Sep 30 14:33:39 crc kubenswrapper[4676]: I0930 14:33:39.276460 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdn2k" event={"ID":"64e83588-fc5d-46c8-ac17-2d8cddb97d90","Type":"ContainerStarted","Data":"186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c"} Sep 30 14:33:39 crc kubenswrapper[4676]: I0930 14:33:39.280301 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kvr7" event={"ID":"fc3c389e-a38f-481d-a180-989c52166852","Type":"ContainerStarted","Data":"8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84"} Sep 30 14:33:39 crc kubenswrapper[4676]: I0930 14:33:39.301830 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rdn2k" podStartSLOduration=2.476910485 podStartE2EDuration="5.301809348s" podCreationTimestamp="2025-09-30 14:33:34 +0000 UTC" firstStartedPulling="2025-09-30 14:33:36.228855994 +0000 UTC m=+2120.211944423" lastFinishedPulling="2025-09-30 14:33:39.053754857 +0000 UTC m=+2123.036843286" observedRunningTime="2025-09-30 
14:33:39.296813768 +0000 UTC m=+2123.279902207" watchObservedRunningTime="2025-09-30 14:33:39.301809348 +0000 UTC m=+2123.284897777" Sep 30 14:33:39 crc kubenswrapper[4676]: I0930 14:33:39.319368 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8kvr7" podStartSLOduration=2.855396027 podStartE2EDuration="5.319349341s" podCreationTimestamp="2025-09-30 14:33:34 +0000 UTC" firstStartedPulling="2025-09-30 14:33:36.233760032 +0000 UTC m=+2120.216848461" lastFinishedPulling="2025-09-30 14:33:38.697713346 +0000 UTC m=+2122.680801775" observedRunningTime="2025-09-30 14:33:39.313270754 +0000 UTC m=+2123.296359173" watchObservedRunningTime="2025-09-30 14:33:39.319349341 +0000 UTC m=+2123.302437760" Sep 30 14:33:44 crc kubenswrapper[4676]: I0930 14:33:44.662050 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:44 crc kubenswrapper[4676]: I0930 14:33:44.664561 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:44 crc kubenswrapper[4676]: I0930 14:33:44.719264 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:44 crc kubenswrapper[4676]: I0930 14:33:44.850161 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:44 crc kubenswrapper[4676]: I0930 14:33:44.850210 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:44 crc kubenswrapper[4676]: I0930 14:33:44.897480 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:45 crc kubenswrapper[4676]: I0930 14:33:45.380184 4676 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:45 crc kubenswrapper[4676]: I0930 14:33:45.381562 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:46 crc kubenswrapper[4676]: I0930 14:33:46.710764 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kvr7"] Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.343466 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8kvr7" podUID="fc3c389e-a38f-481d-a180-989c52166852" containerName="registry-server" containerID="cri-o://8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84" gracePeriod=2 Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.717284 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdn2k"] Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.797568 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.967423 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gqt8\" (UniqueName: \"kubernetes.io/projected/fc3c389e-a38f-481d-a180-989c52166852-kube-api-access-5gqt8\") pod \"fc3c389e-a38f-481d-a180-989c52166852\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.967564 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-catalog-content\") pod \"fc3c389e-a38f-481d-a180-989c52166852\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.967609 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-utilities\") pod \"fc3c389e-a38f-481d-a180-989c52166852\" (UID: \"fc3c389e-a38f-481d-a180-989c52166852\") " Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.968676 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-utilities" (OuterVolumeSpecName: "utilities") pod "fc3c389e-a38f-481d-a180-989c52166852" (UID: "fc3c389e-a38f-481d-a180-989c52166852"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.975802 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3c389e-a38f-481d-a180-989c52166852-kube-api-access-5gqt8" (OuterVolumeSpecName: "kube-api-access-5gqt8") pod "fc3c389e-a38f-481d-a180-989c52166852" (UID: "fc3c389e-a38f-481d-a180-989c52166852"). InnerVolumeSpecName "kube-api-access-5gqt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:33:47 crc kubenswrapper[4676]: I0930 14:33:47.979858 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc3c389e-a38f-481d-a180-989c52166852" (UID: "fc3c389e-a38f-481d-a180-989c52166852"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.069904 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gqt8\" (UniqueName: \"kubernetes.io/projected/fc3c389e-a38f-481d-a180-989c52166852-kube-api-access-5gqt8\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.069951 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.069961 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc3c389e-a38f-481d-a180-989c52166852-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.356096 4676 generic.go:334] "Generic (PLEG): container finished" podID="fc3c389e-a38f-481d-a180-989c52166852" containerID="8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84" exitCode=0 Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.356157 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8kvr7" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.356200 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kvr7" event={"ID":"fc3c389e-a38f-481d-a180-989c52166852","Type":"ContainerDied","Data":"8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84"} Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.356270 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8kvr7" event={"ID":"fc3c389e-a38f-481d-a180-989c52166852","Type":"ContainerDied","Data":"d7b903e1560a5eeaa8f5a798e69fa936daec6c99b6905491afb0389daa501586"} Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.356292 4676 scope.go:117] "RemoveContainer" containerID="8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.356690 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rdn2k" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="registry-server" containerID="cri-o://186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c" gracePeriod=2 Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.393595 4676 scope.go:117] "RemoveContainer" containerID="9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.395466 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kvr7"] Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.406938 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8kvr7"] Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.417799 4676 scope.go:117] "RemoveContainer" containerID="b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e" Sep 30 14:33:48 crc 
kubenswrapper[4676]: I0930 14:33:48.541139 4676 scope.go:117] "RemoveContainer" containerID="8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84" Sep 30 14:33:48 crc kubenswrapper[4676]: E0930 14:33:48.541797 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84\": container with ID starting with 8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84 not found: ID does not exist" containerID="8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.541854 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84"} err="failed to get container status \"8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84\": rpc error: code = NotFound desc = could not find container \"8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84\": container with ID starting with 8007b0672a55538a3e8d6d45067b739e0a40f30f8b339219b3196a0a482a9d84 not found: ID does not exist" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.541912 4676 scope.go:117] "RemoveContainer" containerID="9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754" Sep 30 14:33:48 crc kubenswrapper[4676]: E0930 14:33:48.544195 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754\": container with ID starting with 9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754 not found: ID does not exist" containerID="9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.544242 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754"} err="failed to get container status \"9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754\": rpc error: code = NotFound desc = could not find container \"9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754\": container with ID starting with 9f2ad024058ab900a0cb573ea1d0692b4059b517e23fa171a53c1e9eb51b6754 not found: ID does not exist" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.544276 4676 scope.go:117] "RemoveContainer" containerID="b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e" Sep 30 14:33:48 crc kubenswrapper[4676]: E0930 14:33:48.544545 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e\": container with ID starting with b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e not found: ID does not exist" containerID="b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.544607 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e"} err="failed to get container status \"b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e\": rpc error: code = NotFound desc = could not find container \"b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e\": container with ID starting with b15a354dc6b17134d51ea075cc8390543c8ec0642ce4210e70583104ba45f32e not found: ID does not exist" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.809551 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.984189 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4nx\" (UniqueName: \"kubernetes.io/projected/64e83588-fc5d-46c8-ac17-2d8cddb97d90-kube-api-access-rg4nx\") pod \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.984400 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-utilities\") pod \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.984450 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-catalog-content\") pod \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\" (UID: \"64e83588-fc5d-46c8-ac17-2d8cddb97d90\") " Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.986158 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-utilities" (OuterVolumeSpecName: "utilities") pod "64e83588-fc5d-46c8-ac17-2d8cddb97d90" (UID: "64e83588-fc5d-46c8-ac17-2d8cddb97d90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:33:48 crc kubenswrapper[4676]: I0930 14:33:48.993666 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e83588-fc5d-46c8-ac17-2d8cddb97d90-kube-api-access-rg4nx" (OuterVolumeSpecName: "kube-api-access-rg4nx") pod "64e83588-fc5d-46c8-ac17-2d8cddb97d90" (UID: "64e83588-fc5d-46c8-ac17-2d8cddb97d90"). InnerVolumeSpecName "kube-api-access-rg4nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.035835 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64e83588-fc5d-46c8-ac17-2d8cddb97d90" (UID: "64e83588-fc5d-46c8-ac17-2d8cddb97d90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.087399 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4nx\" (UniqueName: \"kubernetes.io/projected/64e83588-fc5d-46c8-ac17-2d8cddb97d90-kube-api-access-rg4nx\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.087467 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.087482 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e83588-fc5d-46c8-ac17-2d8cddb97d90-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.369800 4676 generic.go:334] "Generic (PLEG): container finished" podID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerID="186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c" exitCode=0 Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.369858 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdn2k" event={"ID":"64e83588-fc5d-46c8-ac17-2d8cddb97d90","Type":"ContainerDied","Data":"186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c"} Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.369908 4676 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-rdn2k" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.369932 4676 scope.go:117] "RemoveContainer" containerID="186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.369920 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rdn2k" event={"ID":"64e83588-fc5d-46c8-ac17-2d8cddb97d90","Type":"ContainerDied","Data":"d7ebf02e93512ffd4a8b8b827f2c6ffaa8657d0f7c1f5b791fe0645ec44a3951"} Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.403498 4676 scope.go:117] "RemoveContainer" containerID="c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.407068 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rdn2k"] Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.415416 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rdn2k"] Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.425445 4676 scope.go:117] "RemoveContainer" containerID="48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.443522 4676 scope.go:117] "RemoveContainer" containerID="186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c" Sep 30 14:33:49 crc kubenswrapper[4676]: E0930 14:33:49.444200 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c\": container with ID starting with 186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c not found: ID does not exist" containerID="186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.444254 
4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c"} err="failed to get container status \"186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c\": rpc error: code = NotFound desc = could not find container \"186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c\": container with ID starting with 186aaab8bed092c0f53ce44317ee3136fa1b9a192139d8bc8fb1ca962243cc0c not found: ID does not exist" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.444282 4676 scope.go:117] "RemoveContainer" containerID="c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945" Sep 30 14:33:49 crc kubenswrapper[4676]: E0930 14:33:49.444777 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945\": container with ID starting with c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945 not found: ID does not exist" containerID="c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.444835 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945"} err="failed to get container status \"c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945\": rpc error: code = NotFound desc = could not find container \"c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945\": container with ID starting with c653007a50dfa811eddf68f9b05b0c291531fb9f4b634d48f0ee1d145dd0d945 not found: ID does not exist" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.444864 4676 scope.go:117] "RemoveContainer" containerID="48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 
14:33:49.445061 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" path="/var/lib/kubelet/pods/64e83588-fc5d-46c8-ac17-2d8cddb97d90/volumes" Sep 30 14:33:49 crc kubenswrapper[4676]: E0930 14:33:49.445492 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb\": container with ID starting with 48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb not found: ID does not exist" containerID="48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.445544 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb"} err="failed to get container status \"48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb\": rpc error: code = NotFound desc = could not find container \"48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb\": container with ID starting with 48e3d2a72f80ac023681f2eb7bdf103e3f76d03cb5f39cdcbdd3d44dc0d7a4fb not found: ID does not exist" Sep 30 14:33:49 crc kubenswrapper[4676]: I0930 14:33:49.445685 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3c389e-a38f-481d-a180-989c52166852" path="/var/lib/kubelet/pods/fc3c389e-a38f-481d-a180-989c52166852/volumes" Sep 30 14:34:15 crc kubenswrapper[4676]: I0930 14:34:15.602785 4676 generic.go:334] "Generic (PLEG): container finished" podID="2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7" containerID="f94919ccbfefbf8bceb1536f7cb7d19e6bb680ad33f640e85fb9fd7bdd764376" exitCode=0 Sep 30 14:34:15 crc kubenswrapper[4676]: I0930 14:34:15.603114 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" 
event={"ID":"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7","Type":"ContainerDied","Data":"f94919ccbfefbf8bceb1536f7cb7d19e6bb680ad33f640e85fb9fd7bdd764376"} Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.033861 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.126739 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-inventory\") pod \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.126997 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-ssh-key\") pod \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.127124 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhsgn\" (UniqueName: \"kubernetes.io/projected/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-kube-api-access-lhsgn\") pod \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\" (UID: \"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7\") " Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.132418 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-kube-api-access-lhsgn" (OuterVolumeSpecName: "kube-api-access-lhsgn") pod "2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7" (UID: "2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7"). InnerVolumeSpecName "kube-api-access-lhsgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.154511 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7" (UID: "2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.156816 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-inventory" (OuterVolumeSpecName: "inventory") pod "2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7" (UID: "2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.229666 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.229715 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.229726 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhsgn\" (UniqueName: \"kubernetes.io/projected/2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7-kube-api-access-lhsgn\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.622503 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" 
event={"ID":"2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7","Type":"ContainerDied","Data":"4d2494d465ca86715d234fd617d327d62d38fc188f13b57433981565294393dc"} Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.622545 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2494d465ca86715d234fd617d327d62d38fc188f13b57433981565294393dc" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.622569 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701029 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kgg8r"] Sep 30 14:34:17 crc kubenswrapper[4676]: E0930 14:34:17.701469 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701496 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:17 crc kubenswrapper[4676]: E0930 14:34:17.701519 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="registry-server" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701526 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="registry-server" Sep 30 14:34:17 crc kubenswrapper[4676]: E0930 14:34:17.701535 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3c389e-a38f-481d-a180-989c52166852" containerName="registry-server" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701542 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3c389e-a38f-481d-a180-989c52166852" 
containerName="registry-server" Sep 30 14:34:17 crc kubenswrapper[4676]: E0930 14:34:17.701560 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3c389e-a38f-481d-a180-989c52166852" containerName="extract-content" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701567 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3c389e-a38f-481d-a180-989c52166852" containerName="extract-content" Sep 30 14:34:17 crc kubenswrapper[4676]: E0930 14:34:17.701580 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="extract-content" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701589 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="extract-content" Sep 30 14:34:17 crc kubenswrapper[4676]: E0930 14:34:17.701611 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="extract-utilities" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701620 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="extract-utilities" Sep 30 14:34:17 crc kubenswrapper[4676]: E0930 14:34:17.701636 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3c389e-a38f-481d-a180-989c52166852" containerName="extract-utilities" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701643 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3c389e-a38f-481d-a180-989c52166852" containerName="extract-utilities" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701868 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e83588-fc5d-46c8-ac17-2d8cddb97d90" containerName="registry-server" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701902 4676 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fc3c389e-a38f-481d-a180-989c52166852" containerName="registry-server" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.701915 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.702515 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.707005 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.707172 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.707264 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.709143 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.716788 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kgg8r"] Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.841365 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.841427 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.841486 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kw9f\" (UniqueName: \"kubernetes.io/projected/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-kube-api-access-2kw9f\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.942992 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kw9f\" (UniqueName: \"kubernetes.io/projected/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-kube-api-access-2kw9f\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.943139 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.943181 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc 
kubenswrapper[4676]: I0930 14:34:17.948493 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.948901 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:17 crc kubenswrapper[4676]: I0930 14:34:17.960532 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kw9f\" (UniqueName: \"kubernetes.io/projected/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-kube-api-access-2kw9f\") pod \"ssh-known-hosts-edpm-deployment-kgg8r\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:18 crc kubenswrapper[4676]: I0930 14:34:18.023077 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:18 crc kubenswrapper[4676]: I0930 14:34:18.575612 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kgg8r"] Sep 30 14:34:18 crc kubenswrapper[4676]: I0930 14:34:18.631387 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" event={"ID":"c9aca039-cea1-4fe5-8ee1-226f22cbefd2","Type":"ContainerStarted","Data":"e64cd3ab913f0671bdde29f608b3b09fc331b576d36f44742b78d58a05c0dd32"} Sep 30 14:34:19 crc kubenswrapper[4676]: I0930 14:34:19.641298 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" event={"ID":"c9aca039-cea1-4fe5-8ee1-226f22cbefd2","Type":"ContainerStarted","Data":"684c2418715fb5ea923f86f851c8618b62e321a1367f40a7a2bfc182e127aec2"} Sep 30 14:34:19 crc kubenswrapper[4676]: I0930 14:34:19.668848 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" podStartSLOduration=2.244193436 podStartE2EDuration="2.668827408s" podCreationTimestamp="2025-09-30 14:34:17 +0000 UTC" firstStartedPulling="2025-09-30 14:34:18.586511393 +0000 UTC m=+2162.569599822" lastFinishedPulling="2025-09-30 14:34:19.011145365 +0000 UTC m=+2162.994233794" observedRunningTime="2025-09-30 14:34:19.661276835 +0000 UTC m=+2163.644365274" watchObservedRunningTime="2025-09-30 14:34:19.668827408 +0000 UTC m=+2163.651915837" Sep 30 14:34:26 crc kubenswrapper[4676]: I0930 14:34:26.705639 4676 generic.go:334] "Generic (PLEG): container finished" podID="c9aca039-cea1-4fe5-8ee1-226f22cbefd2" containerID="684c2418715fb5ea923f86f851c8618b62e321a1367f40a7a2bfc182e127aec2" exitCode=0 Sep 30 14:34:26 crc kubenswrapper[4676]: I0930 14:34:26.706073 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" 
event={"ID":"c9aca039-cea1-4fe5-8ee1-226f22cbefd2","Type":"ContainerDied","Data":"684c2418715fb5ea923f86f851c8618b62e321a1367f40a7a2bfc182e127aec2"} Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.126774 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.243311 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-ssh-key-openstack-edpm-ipam\") pod \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.243865 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kw9f\" (UniqueName: \"kubernetes.io/projected/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-kube-api-access-2kw9f\") pod \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.243983 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-inventory-0\") pod \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\" (UID: \"c9aca039-cea1-4fe5-8ee1-226f22cbefd2\") " Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.249695 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-kube-api-access-2kw9f" (OuterVolumeSpecName: "kube-api-access-2kw9f") pod "c9aca039-cea1-4fe5-8ee1-226f22cbefd2" (UID: "c9aca039-cea1-4fe5-8ee1-226f22cbefd2"). InnerVolumeSpecName "kube-api-access-2kw9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.272353 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c9aca039-cea1-4fe5-8ee1-226f22cbefd2" (UID: "c9aca039-cea1-4fe5-8ee1-226f22cbefd2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.272400 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c9aca039-cea1-4fe5-8ee1-226f22cbefd2" (UID: "c9aca039-cea1-4fe5-8ee1-226f22cbefd2"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.346098 4676 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.346137 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.346148 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kw9f\" (UniqueName: \"kubernetes.io/projected/c9aca039-cea1-4fe5-8ee1-226f22cbefd2-kube-api-access-2kw9f\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.725171 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" 
event={"ID":"c9aca039-cea1-4fe5-8ee1-226f22cbefd2","Type":"ContainerDied","Data":"e64cd3ab913f0671bdde29f608b3b09fc331b576d36f44742b78d58a05c0dd32"} Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.725216 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64cd3ab913f0671bdde29f608b3b09fc331b576d36f44742b78d58a05c0dd32" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.725483 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kgg8r" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.800372 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4"] Sep 30 14:34:28 crc kubenswrapper[4676]: E0930 14:34:28.801541 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9aca039-cea1-4fe5-8ee1-226f22cbefd2" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.801630 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9aca039-cea1-4fe5-8ee1-226f22cbefd2" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.802214 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9aca039-cea1-4fe5-8ee1-226f22cbefd2" containerName="ssh-known-hosts-edpm-deployment" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.803285 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.806744 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.807218 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.807644 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.807849 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.844097 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4"] Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.956520 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvrb\" (UniqueName: \"kubernetes.io/projected/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-kube-api-access-wkvrb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.956686 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:28 crc kubenswrapper[4676]: I0930 14:34:28.956746 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.058913 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.059366 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.059509 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvrb\" (UniqueName: \"kubernetes.io/projected/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-kube-api-access-wkvrb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.062976 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.063498 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.083715 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvrb\" (UniqueName: \"kubernetes.io/projected/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-kube-api-access-wkvrb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-flzt4\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.128052 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.642993 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4"] Sep 30 14:34:29 crc kubenswrapper[4676]: I0930 14:34:29.735535 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" event={"ID":"44bcdced-a8cf-4b1d-baa4-31988a1ca72d","Type":"ContainerStarted","Data":"8dd1e2604c184ba39f0a92aac709ede25a475e89fda0718d5dec744d6ced301c"} Sep 30 14:34:30 crc kubenswrapper[4676]: I0930 14:34:30.746088 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" event={"ID":"44bcdced-a8cf-4b1d-baa4-31988a1ca72d","Type":"ContainerStarted","Data":"4ab09d56a7e4d96347a7c2ed7841825209a1e3f941d7a83c60de3c3a8f1f805d"} Sep 30 14:34:30 crc kubenswrapper[4676]: I0930 14:34:30.779750 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" podStartSLOduration=2.33750013 podStartE2EDuration="2.779723995s" podCreationTimestamp="2025-09-30 14:34:28 +0000 UTC" firstStartedPulling="2025-09-30 14:34:29.648536975 +0000 UTC m=+2173.631625404" lastFinishedPulling="2025-09-30 14:34:30.09076084 +0000 UTC m=+2174.073849269" observedRunningTime="2025-09-30 14:34:30.766734563 +0000 UTC m=+2174.749823002" watchObservedRunningTime="2025-09-30 14:34:30.779723995 +0000 UTC m=+2174.762812454" Sep 30 14:34:38 crc kubenswrapper[4676]: I0930 14:34:38.820757 4676 generic.go:334] "Generic (PLEG): container finished" podID="44bcdced-a8cf-4b1d-baa4-31988a1ca72d" containerID="4ab09d56a7e4d96347a7c2ed7841825209a1e3f941d7a83c60de3c3a8f1f805d" exitCode=0 Sep 30 14:34:38 crc kubenswrapper[4676]: I0930 14:34:38.820852 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" event={"ID":"44bcdced-a8cf-4b1d-baa4-31988a1ca72d","Type":"ContainerDied","Data":"4ab09d56a7e4d96347a7c2ed7841825209a1e3f941d7a83c60de3c3a8f1f805d"} Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.289227 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.391386 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-ssh-key\") pod \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.391780 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvrb\" (UniqueName: \"kubernetes.io/projected/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-kube-api-access-wkvrb\") pod \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.391961 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-inventory\") pod \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\" (UID: \"44bcdced-a8cf-4b1d-baa4-31988a1ca72d\") " Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.396977 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-kube-api-access-wkvrb" (OuterVolumeSpecName: "kube-api-access-wkvrb") pod "44bcdced-a8cf-4b1d-baa4-31988a1ca72d" (UID: "44bcdced-a8cf-4b1d-baa4-31988a1ca72d"). InnerVolumeSpecName "kube-api-access-wkvrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.420666 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-inventory" (OuterVolumeSpecName: "inventory") pod "44bcdced-a8cf-4b1d-baa4-31988a1ca72d" (UID: "44bcdced-a8cf-4b1d-baa4-31988a1ca72d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.420856 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44bcdced-a8cf-4b1d-baa4-31988a1ca72d" (UID: "44bcdced-a8cf-4b1d-baa4-31988a1ca72d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.497794 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.497838 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.497853 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvrb\" (UniqueName: \"kubernetes.io/projected/44bcdced-a8cf-4b1d-baa4-31988a1ca72d-kube-api-access-wkvrb\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.843415 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" 
event={"ID":"44bcdced-a8cf-4b1d-baa4-31988a1ca72d","Type":"ContainerDied","Data":"8dd1e2604c184ba39f0a92aac709ede25a475e89fda0718d5dec744d6ced301c"} Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.843703 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd1e2604c184ba39f0a92aac709ede25a475e89fda0718d5dec744d6ced301c" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.843492 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-flzt4" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.923121 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz"] Sep 30 14:34:40 crc kubenswrapper[4676]: E0930 14:34:40.923588 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bcdced-a8cf-4b1d-baa4-31988a1ca72d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.923611 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bcdced-a8cf-4b1d-baa4-31988a1ca72d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.923859 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bcdced-a8cf-4b1d-baa4-31988a1ca72d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.924757 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.931121 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.931184 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.931196 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.931640 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:34:40 crc kubenswrapper[4676]: I0930 14:34:40.935357 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz"] Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.007028 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.007100 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.007366 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nmv\" (UniqueName: \"kubernetes.io/projected/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-kube-api-access-l5nmv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.111281 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nmv\" (UniqueName: \"kubernetes.io/projected/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-kube-api-access-l5nmv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.111495 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.111528 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.118751 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: 
\"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.119281 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.130470 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nmv\" (UniqueName: \"kubernetes.io/projected/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-kube-api-access-l5nmv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.254216 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.763840 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz"] Sep 30 14:34:41 crc kubenswrapper[4676]: I0930 14:34:41.855846 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" event={"ID":"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc","Type":"ContainerStarted","Data":"73ce6f26aa9d4f0d81255ea60a387cbb307717de02edef078bb9b44e5990c22c"} Sep 30 14:34:42 crc kubenswrapper[4676]: I0930 14:34:42.871380 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" event={"ID":"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc","Type":"ContainerStarted","Data":"9a673bfa4f78bf3f490981c57cf01fa47d6154c2ade4f03c52ffadb01ad56224"} Sep 30 14:34:42 crc kubenswrapper[4676]: I0930 14:34:42.890937 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" podStartSLOduration=2.431239828 podStartE2EDuration="2.890916033s" podCreationTimestamp="2025-09-30 14:34:40 +0000 UTC" firstStartedPulling="2025-09-30 14:34:41.774509769 +0000 UTC m=+2185.757598198" lastFinishedPulling="2025-09-30 14:34:42.234185964 +0000 UTC m=+2186.217274403" observedRunningTime="2025-09-30 14:34:42.888426034 +0000 UTC m=+2186.871514463" watchObservedRunningTime="2025-09-30 14:34:42.890916033 +0000 UTC m=+2186.874004462" Sep 30 14:34:51 crc kubenswrapper[4676]: I0930 14:34:51.957330 4676 generic.go:334] "Generic (PLEG): container finished" podID="fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc" containerID="9a673bfa4f78bf3f490981c57cf01fa47d6154c2ade4f03c52ffadb01ad56224" exitCode=0 Sep 30 14:34:51 crc kubenswrapper[4676]: I0930 14:34:51.957441 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" event={"ID":"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc","Type":"ContainerDied","Data":"9a673bfa4f78bf3f490981c57cf01fa47d6154c2ade4f03c52ffadb01ad56224"} Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.380735 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.447380 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-inventory\") pod \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.447432 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-ssh-key\") pod \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.447474 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5nmv\" (UniqueName: \"kubernetes.io/projected/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-kube-api-access-l5nmv\") pod \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\" (UID: \"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc\") " Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.452447 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-kube-api-access-l5nmv" (OuterVolumeSpecName: "kube-api-access-l5nmv") pod "fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc" (UID: "fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc"). InnerVolumeSpecName "kube-api-access-l5nmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.471914 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-inventory" (OuterVolumeSpecName: "inventory") pod "fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc" (UID: "fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.475572 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc" (UID: "fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.549983 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.550290 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.550371 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5nmv\" (UniqueName: \"kubernetes.io/projected/fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc-kube-api-access-l5nmv\") on node \"crc\" DevicePath \"\"" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.977529 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" 
event={"ID":"fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc","Type":"ContainerDied","Data":"73ce6f26aa9d4f0d81255ea60a387cbb307717de02edef078bb9b44e5990c22c"} Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.977576 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ce6f26aa9d4f0d81255ea60a387cbb307717de02edef078bb9b44e5990c22c" Sep 30 14:34:53 crc kubenswrapper[4676]: I0930 14:34:53.978053 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.053442 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx"] Sep 30 14:34:54 crc kubenswrapper[4676]: E0930 14:34:54.053862 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.053903 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.054151 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.054784 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.056394 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.057265 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.057388 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.057425 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.057759 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.057998 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.058390 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.059560 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.076938 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx"] Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.161411 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.161740 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.161800 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.161823 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.161869 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162023 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162055 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162071 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162113 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162221 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162252 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162295 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162321 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jfj\" (UniqueName: 
\"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-kube-api-access-g7jfj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.162340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264220 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264349 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264397 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264445 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jfj\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-kube-api-access-g7jfj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264464 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264494 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264513 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264539 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264557 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264577 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264620 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264656 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264677 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.264726 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.269603 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 
14:34:54.269805 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.270060 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.270207 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.270559 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.271194 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.271645 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.271965 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.272058 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.272478 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: 
\"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.272640 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.273138 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.273993 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.282279 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jfj\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-kube-api-access-g7jfj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-524mx\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.372353 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.891339 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx"] Sep 30 14:34:54 crc kubenswrapper[4676]: I0930 14:34:54.988656 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" event={"ID":"425308e0-6300-4e6a-922e-dc9ef39d61f8","Type":"ContainerStarted","Data":"5d0c9967a8587e1c1b9877dbe26a77f8f384d54354bf2dc29c4f94ca373a6aa7"} Sep 30 14:34:56 crc kubenswrapper[4676]: I0930 14:34:56.000195 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" event={"ID":"425308e0-6300-4e6a-922e-dc9ef39d61f8","Type":"ContainerStarted","Data":"974d100d6cd59321e9531dffdd190db68fb7af9475aef2b81b7f277a11bf0157"} Sep 30 14:34:56 crc kubenswrapper[4676]: I0930 14:34:56.026557 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" podStartSLOduration=1.536917546 podStartE2EDuration="2.026529102s" podCreationTimestamp="2025-09-30 14:34:54 +0000 UTC" firstStartedPulling="2025-09-30 14:34:54.897395831 +0000 UTC m=+2198.880484260" lastFinishedPulling="2025-09-30 14:34:55.387007387 +0000 UTC m=+2199.370095816" observedRunningTime="2025-09-30 14:34:56.016744926 +0000 UTC m=+2199.999833375" watchObservedRunningTime="2025-09-30 14:34:56.026529102 +0000 UTC m=+2200.009617531" Sep 30 14:34:59 crc kubenswrapper[4676]: I0930 14:34:59.919058 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:34:59 crc kubenswrapper[4676]: I0930 14:34:59.919421 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:35:29 crc kubenswrapper[4676]: I0930 14:35:29.920168 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:35:29 crc kubenswrapper[4676]: I0930 14:35:29.921123 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:35:32 crc kubenswrapper[4676]: I0930 14:35:32.326199 4676 generic.go:334] "Generic (PLEG): container finished" podID="425308e0-6300-4e6a-922e-dc9ef39d61f8" containerID="974d100d6cd59321e9531dffdd190db68fb7af9475aef2b81b7f277a11bf0157" exitCode=0 Sep 30 14:35:32 crc kubenswrapper[4676]: I0930 14:35:32.326417 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" event={"ID":"425308e0-6300-4e6a-922e-dc9ef39d61f8","Type":"ContainerDied","Data":"974d100d6cd59321e9531dffdd190db68fb7af9475aef2b81b7f277a11bf0157"} Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.781166 4676 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.893587 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-nova-combined-ca-bundle\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.894040 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-libvirt-combined-ca-bundle\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.894172 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.894284 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ssh-key\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.894405 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.894564 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-telemetry-combined-ca-bundle\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.894695 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-inventory\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.894853 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ovn-combined-ca-bundle\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.895054 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7jfj\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-kube-api-access-g7jfj\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.895278 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: 
I0930 14:35:33.895400 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-bootstrap-combined-ca-bundle\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.895511 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-neutron-metadata-combined-ca-bundle\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.895599 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-repo-setup-combined-ca-bundle\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.895714 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"425308e0-6300-4e6a-922e-dc9ef39d61f8\" (UID: \"425308e0-6300-4e6a-922e-dc9ef39d61f8\") " Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.900441 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.900764 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.901442 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.901690 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.901778 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.902195 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.902493 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.903363 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-kube-api-access-g7jfj" (OuterVolumeSpecName: "kube-api-access-g7jfj") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "kube-api-access-g7jfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.903950 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.904054 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.905857 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.907115 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.928134 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-inventory" (OuterVolumeSpecName: "inventory") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.930981 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "425308e0-6300-4e6a-922e-dc9ef39d61f8" (UID: "425308e0-6300-4e6a-922e-dc9ef39d61f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998407 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998449 4676 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998462 4676 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998471 4676 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998480 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" 
DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998489 4676 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998500 4676 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998509 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998518 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998526 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998535 4676 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998543 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998553 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425308e0-6300-4e6a-922e-dc9ef39d61f8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:33 crc kubenswrapper[4676]: I0930 14:35:33.998561 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7jfj\" (UniqueName: \"kubernetes.io/projected/425308e0-6300-4e6a-922e-dc9ef39d61f8-kube-api-access-g7jfj\") on node \"crc\" DevicePath \"\"" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.350235 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" event={"ID":"425308e0-6300-4e6a-922e-dc9ef39d61f8","Type":"ContainerDied","Data":"5d0c9967a8587e1c1b9877dbe26a77f8f384d54354bf2dc29c4f94ca373a6aa7"} Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.350606 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d0c9967a8587e1c1b9877dbe26a77f8f384d54354bf2dc29c4f94ca373a6aa7" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.350295 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-524mx" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.465724 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj"] Sep 30 14:35:34 crc kubenswrapper[4676]: E0930 14:35:34.466251 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425308e0-6300-4e6a-922e-dc9ef39d61f8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.466277 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="425308e0-6300-4e6a-922e-dc9ef39d61f8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.466502 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="425308e0-6300-4e6a-922e-dc9ef39d61f8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.467297 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.473095 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.473329 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.473456 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.473699 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.473892 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.490001 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj"] Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.609578 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.609646 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.609738 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.611129 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zrzc\" (UniqueName: \"kubernetes.io/projected/ac2b4c79-3867-4f1b-bb55-d0978cffaded-kube-api-access-6zrzc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.611288 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.712977 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.713043 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.713064 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.713147 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrzc\" (UniqueName: \"kubernetes.io/projected/ac2b4c79-3867-4f1b-bb55-d0978cffaded-kube-api-access-6zrzc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.713239 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.714311 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc 
kubenswrapper[4676]: I0930 14:35:34.718312 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.718375 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.737216 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.759925 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zrzc\" (UniqueName: \"kubernetes.io/projected/ac2b4c79-3867-4f1b-bb55-d0978cffaded-kube-api-access-6zrzc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js2fj\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:34 crc kubenswrapper[4676]: I0930 14:35:34.790016 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:35:35 crc kubenswrapper[4676]: W0930 14:35:35.336784 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac2b4c79_3867_4f1b_bb55_d0978cffaded.slice/crio-11b045df675c4aedc2e3902f39e4132286e1e7d90314ebac72dd296200731480 WatchSource:0}: Error finding container 11b045df675c4aedc2e3902f39e4132286e1e7d90314ebac72dd296200731480: Status 404 returned error can't find the container with id 11b045df675c4aedc2e3902f39e4132286e1e7d90314ebac72dd296200731480 Sep 30 14:35:35 crc kubenswrapper[4676]: I0930 14:35:35.338337 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj"] Sep 30 14:35:35 crc kubenswrapper[4676]: I0930 14:35:35.346325 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:35:35 crc kubenswrapper[4676]: I0930 14:35:35.376093 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" event={"ID":"ac2b4c79-3867-4f1b-bb55-d0978cffaded","Type":"ContainerStarted","Data":"11b045df675c4aedc2e3902f39e4132286e1e7d90314ebac72dd296200731480"} Sep 30 14:35:36 crc kubenswrapper[4676]: I0930 14:35:36.385911 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" event={"ID":"ac2b4c79-3867-4f1b-bb55-d0978cffaded","Type":"ContainerStarted","Data":"921b48e9fe09e41f1fe0ee3bc951670dc88bbe296a54ee176254bb8ee8c84bfc"} Sep 30 14:35:36 crc kubenswrapper[4676]: I0930 14:35:36.409231 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" podStartSLOduration=1.960928333 podStartE2EDuration="2.409207258s" podCreationTimestamp="2025-09-30 14:35:34 +0000 UTC" firstStartedPulling="2025-09-30 
14:35:35.345972454 +0000 UTC m=+2239.329060893" lastFinishedPulling="2025-09-30 14:35:35.794251389 +0000 UTC m=+2239.777339818" observedRunningTime="2025-09-30 14:35:36.40653162 +0000 UTC m=+2240.389620059" watchObservedRunningTime="2025-09-30 14:35:36.409207258 +0000 UTC m=+2240.392295687" Sep 30 14:35:59 crc kubenswrapper[4676]: I0930 14:35:59.920248 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:35:59 crc kubenswrapper[4676]: I0930 14:35:59.921058 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:35:59 crc kubenswrapper[4676]: I0930 14:35:59.921109 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:35:59 crc kubenswrapper[4676]: I0930 14:35:59.921871 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"587ffe0c6c9654995cc627899f735a2c46498e902ecdd2084cd7e9cf322d7a4f"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:35:59 crc kubenswrapper[4676]: I0930 14:35:59.921964 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" 
containerID="cri-o://587ffe0c6c9654995cc627899f735a2c46498e902ecdd2084cd7e9cf322d7a4f" gracePeriod=600 Sep 30 14:36:00 crc kubenswrapper[4676]: I0930 14:36:00.589234 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="587ffe0c6c9654995cc627899f735a2c46498e902ecdd2084cd7e9cf322d7a4f" exitCode=0 Sep 30 14:36:00 crc kubenswrapper[4676]: I0930 14:36:00.589465 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"587ffe0c6c9654995cc627899f735a2c46498e902ecdd2084cd7e9cf322d7a4f"} Sep 30 14:36:00 crc kubenswrapper[4676]: I0930 14:36:00.589831 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"} Sep 30 14:36:00 crc kubenswrapper[4676]: I0930 14:36:00.589850 4676 scope.go:117] "RemoveContainer" containerID="594bb900fe8a86f6d24186adf1f821b6fb52e495dce4ee8695064f3b8033f590" Sep 30 14:36:35 crc kubenswrapper[4676]: I0930 14:36:35.897618 4676 generic.go:334] "Generic (PLEG): container finished" podID="ac2b4c79-3867-4f1b-bb55-d0978cffaded" containerID="921b48e9fe09e41f1fe0ee3bc951670dc88bbe296a54ee176254bb8ee8c84bfc" exitCode=0 Sep 30 14:36:35 crc kubenswrapper[4676]: I0930 14:36:35.897714 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" event={"ID":"ac2b4c79-3867-4f1b-bb55-d0978cffaded","Type":"ContainerDied","Data":"921b48e9fe09e41f1fe0ee3bc951670dc88bbe296a54ee176254bb8ee8c84bfc"} Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.299842 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.463248 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ssh-key\") pod \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.463505 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zrzc\" (UniqueName: \"kubernetes.io/projected/ac2b4c79-3867-4f1b-bb55-d0978cffaded-kube-api-access-6zrzc\") pod \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.463540 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-inventory\") pod \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.463597 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovn-combined-ca-bundle\") pod \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.463764 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovncontroller-config-0\") pod \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\" (UID: \"ac2b4c79-3867-4f1b-bb55-d0978cffaded\") " Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.469840 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ac2b4c79-3867-4f1b-bb55-d0978cffaded" (UID: "ac2b4c79-3867-4f1b-bb55-d0978cffaded"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.472190 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2b4c79-3867-4f1b-bb55-d0978cffaded-kube-api-access-6zrzc" (OuterVolumeSpecName: "kube-api-access-6zrzc") pod "ac2b4c79-3867-4f1b-bb55-d0978cffaded" (UID: "ac2b4c79-3867-4f1b-bb55-d0978cffaded"). InnerVolumeSpecName "kube-api-access-6zrzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.495170 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ac2b4c79-3867-4f1b-bb55-d0978cffaded" (UID: "ac2b4c79-3867-4f1b-bb55-d0978cffaded"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.497700 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ac2b4c79-3867-4f1b-bb55-d0978cffaded" (UID: "ac2b4c79-3867-4f1b-bb55-d0978cffaded"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.498238 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-inventory" (OuterVolumeSpecName: "inventory") pod "ac2b4c79-3867-4f1b-bb55-d0978cffaded" (UID: "ac2b4c79-3867-4f1b-bb55-d0978cffaded"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.566594 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.566632 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zrzc\" (UniqueName: \"kubernetes.io/projected/ac2b4c79-3867-4f1b-bb55-d0978cffaded-kube-api-access-6zrzc\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.566643 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.566653 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.566664 4676 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac2b4c79-3867-4f1b-bb55-d0978cffaded-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.914637 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" event={"ID":"ac2b4c79-3867-4f1b-bb55-d0978cffaded","Type":"ContainerDied","Data":"11b045df675c4aedc2e3902f39e4132286e1e7d90314ebac72dd296200731480"} Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.914686 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b045df675c4aedc2e3902f39e4132286e1e7d90314ebac72dd296200731480" Sep 30 14:36:37 crc kubenswrapper[4676]: I0930 14:36:37.914698 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js2fj" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.064661 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m"] Sep 30 14:36:38 crc kubenswrapper[4676]: E0930 14:36:38.065344 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2b4c79-3867-4f1b-bb55-d0978cffaded" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.065372 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b4c79-3867-4f1b-bb55-d0978cffaded" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.065650 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac2b4c79-3867-4f1b-bb55-d0978cffaded" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.066673 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.071704 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.071835 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.071835 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.072186 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.072314 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.073278 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.076314 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m"] Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.179020 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.179111 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.179173 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.179362 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.179468 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgjhd\" (UniqueName: \"kubernetes.io/projected/87967da4-c3f2-46e1-ae80-230612ebe6af-kube-api-access-cgjhd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.179728 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.281545 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.281598 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.281634 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.281685 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.281738 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgjhd\" (UniqueName: \"kubernetes.io/projected/87967da4-c3f2-46e1-ae80-230612ebe6af-kube-api-access-cgjhd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.281829 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.286356 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.286356 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: 
\"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.286862 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.287310 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.287994 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.301270 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgjhd\" (UniqueName: \"kubernetes.io/projected/87967da4-c3f2-46e1-ae80-230612ebe6af-kube-api-access-cgjhd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc 
kubenswrapper[4676]: I0930 14:36:38.399050 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:36:38 crc kubenswrapper[4676]: I0930 14:36:38.968473 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m"] Sep 30 14:36:39 crc kubenswrapper[4676]: I0930 14:36:39.937570 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" event={"ID":"87967da4-c3f2-46e1-ae80-230612ebe6af","Type":"ContainerStarted","Data":"1e961680eeb4d8f95f9e2d94d254ea1add9675da777dc1b1ae650768ea31c119"} Sep 30 14:36:39 crc kubenswrapper[4676]: I0930 14:36:39.938038 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" event={"ID":"87967da4-c3f2-46e1-ae80-230612ebe6af","Type":"ContainerStarted","Data":"0dde4d4af553270ef7692f3e6aa54503ada953215a8ee0368f790ac062ce65b4"} Sep 30 14:36:39 crc kubenswrapper[4676]: I0930 14:36:39.967649 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" podStartSLOduration=1.495552541 podStartE2EDuration="1.967606012s" podCreationTimestamp="2025-09-30 14:36:38 +0000 UTC" firstStartedPulling="2025-09-30 14:36:38.962649911 +0000 UTC m=+2302.945738340" lastFinishedPulling="2025-09-30 14:36:39.434703392 +0000 UTC m=+2303.417791811" observedRunningTime="2025-09-30 14:36:39.953249912 +0000 UTC m=+2303.936338341" watchObservedRunningTime="2025-09-30 14:36:39.967606012 +0000 UTC m=+2303.950694441" Sep 30 14:37:26 crc kubenswrapper[4676]: I0930 14:37:26.335870 4676 generic.go:334] "Generic (PLEG): container finished" podID="87967da4-c3f2-46e1-ae80-230612ebe6af" containerID="1e961680eeb4d8f95f9e2d94d254ea1add9675da777dc1b1ae650768ea31c119" exitCode=0 Sep 30 
14:37:26 crc kubenswrapper[4676]: I0930 14:37:26.337081 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" event={"ID":"87967da4-c3f2-46e1-ae80-230612ebe6af","Type":"ContainerDied","Data":"1e961680eeb4d8f95f9e2d94d254ea1add9675da777dc1b1ae650768ea31c119"} Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.767333 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.883740 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgjhd\" (UniqueName: \"kubernetes.io/projected/87967da4-c3f2-46e1-ae80-230612ebe6af-kube-api-access-cgjhd\") pod \"87967da4-c3f2-46e1-ae80-230612ebe6af\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.884265 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-ovn-metadata-agent-neutron-config-0\") pod \"87967da4-c3f2-46e1-ae80-230612ebe6af\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.884377 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-inventory\") pod \"87967da4-c3f2-46e1-ae80-230612ebe6af\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.884524 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-nova-metadata-neutron-config-0\") pod 
\"87967da4-c3f2-46e1-ae80-230612ebe6af\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.884714 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-ssh-key\") pod \"87967da4-c3f2-46e1-ae80-230612ebe6af\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.885245 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-metadata-combined-ca-bundle\") pod \"87967da4-c3f2-46e1-ae80-230612ebe6af\" (UID: \"87967da4-c3f2-46e1-ae80-230612ebe6af\") " Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.890670 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "87967da4-c3f2-46e1-ae80-230612ebe6af" (UID: "87967da4-c3f2-46e1-ae80-230612ebe6af"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.893933 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87967da4-c3f2-46e1-ae80-230612ebe6af-kube-api-access-cgjhd" (OuterVolumeSpecName: "kube-api-access-cgjhd") pod "87967da4-c3f2-46e1-ae80-230612ebe6af" (UID: "87967da4-c3f2-46e1-ae80-230612ebe6af"). InnerVolumeSpecName "kube-api-access-cgjhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.914757 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-inventory" (OuterVolumeSpecName: "inventory") pod "87967da4-c3f2-46e1-ae80-230612ebe6af" (UID: "87967da4-c3f2-46e1-ae80-230612ebe6af"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.917045 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "87967da4-c3f2-46e1-ae80-230612ebe6af" (UID: "87967da4-c3f2-46e1-ae80-230612ebe6af"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.919052 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "87967da4-c3f2-46e1-ae80-230612ebe6af" (UID: "87967da4-c3f2-46e1-ae80-230612ebe6af"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.919619 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87967da4-c3f2-46e1-ae80-230612ebe6af" (UID: "87967da4-c3f2-46e1-ae80-230612ebe6af"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.987698 4676 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.987746 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.987761 4676 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.987773 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.987799 4676 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87967da4-c3f2-46e1-ae80-230612ebe6af-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:27 crc kubenswrapper[4676]: I0930 14:37:27.987814 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgjhd\" (UniqueName: \"kubernetes.io/projected/87967da4-c3f2-46e1-ae80-230612ebe6af-kube-api-access-cgjhd\") on node \"crc\" DevicePath \"\"" Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.357451 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" 
event={"ID":"87967da4-c3f2-46e1-ae80-230612ebe6af","Type":"ContainerDied","Data":"0dde4d4af553270ef7692f3e6aa54503ada953215a8ee0368f790ac062ce65b4"} Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.357500 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dde4d4af553270ef7692f3e6aa54503ada953215a8ee0368f790ac062ce65b4" Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.357595 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m" Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.465537 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"] Sep 30 14:37:28 crc kubenswrapper[4676]: E0930 14:37:28.465990 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87967da4-c3f2-46e1-ae80-230612ebe6af" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.466013 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="87967da4-c3f2-46e1-ae80-230612ebe6af" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.466201 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="87967da4-c3f2-46e1-ae80-230612ebe6af" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.467062 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.469306 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.469545 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.471128 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.471626 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.476499 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.480605 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"]
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.598679 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.598864 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.598937 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glrc4\" (UniqueName: \"kubernetes.io/projected/53729d22-521b-4f61-a225-832492a98b7b-kube-api-access-glrc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.598965 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.598997 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.701994 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.702073 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glrc4\" (UniqueName: \"kubernetes.io/projected/53729d22-521b-4f61-a225-832492a98b7b-kube-api-access-glrc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.702113 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.702149 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.702230 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.706978 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.707572 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.709117 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.709854 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.724377 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glrc4\" (UniqueName: \"kubernetes.io/projected/53729d22-521b-4f61-a225-832492a98b7b-kube-api-access-glrc4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:28 crc kubenswrapper[4676]: I0930 14:37:28.789825 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:37:29 crc kubenswrapper[4676]: I0930 14:37:29.354614 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"]
Sep 30 14:37:30 crc kubenswrapper[4676]: I0930 14:37:30.425486 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6" event={"ID":"53729d22-521b-4f61-a225-832492a98b7b","Type":"ContainerStarted","Data":"2e2286f0d95c5f4b56bfeb8b1a30e2f1b9e6a3f0ed95580f3d53529837fa1e1e"}
Sep 30 14:37:30 crc kubenswrapper[4676]: I0930 14:37:30.426101 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6" event={"ID":"53729d22-521b-4f61-a225-832492a98b7b","Type":"ContainerStarted","Data":"874be3f0e982847d6fb4fe3971442b61963db208a68c9dd7ccee27eee3a9a509"}
Sep 30 14:37:30 crc kubenswrapper[4676]: I0930 14:37:30.447679 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6" podStartSLOduration=1.951166578 podStartE2EDuration="2.447652367s" podCreationTimestamp="2025-09-30 14:37:28 +0000 UTC" firstStartedPulling="2025-09-30 14:37:29.364535869 +0000 UTC m=+2353.347624298" lastFinishedPulling="2025-09-30 14:37:29.861021658 +0000 UTC m=+2353.844110087" observedRunningTime="2025-09-30 14:37:30.441407359 +0000 UTC m=+2354.424495788" watchObservedRunningTime="2025-09-30 14:37:30.447652367 +0000 UTC m=+2354.430740796"
Sep 30 14:38:29 crc kubenswrapper[4676]: I0930 14:38:29.919600 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:38:29 crc kubenswrapper[4676]: I0930 14:38:29.920236 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:38:59 crc kubenswrapper[4676]: I0930 14:38:59.919666 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:38:59 crc kubenswrapper[4676]: I0930 14:38:59.920295 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:39:29 crc kubenswrapper[4676]: I0930 14:39:29.920016 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:39:29 crc kubenswrapper[4676]: I0930 14:39:29.921167 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:39:29 crc kubenswrapper[4676]: I0930 14:39:29.921256 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 14:39:29 crc kubenswrapper[4676]: I0930 14:39:29.922559 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 14:39:29 crc kubenswrapper[4676]: I0930 14:39:29.922653 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" gracePeriod=600
Sep 30 14:39:30 crc kubenswrapper[4676]: E0930 14:39:30.071449 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:39:30 crc kubenswrapper[4676]: I0930 14:39:30.483267 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" exitCode=0
Sep 30 14:39:30 crc kubenswrapper[4676]: I0930 14:39:30.483323 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"}
Sep 30 14:39:30 crc kubenswrapper[4676]: I0930 14:39:30.483364 4676 scope.go:117] "RemoveContainer" containerID="587ffe0c6c9654995cc627899f735a2c46498e902ecdd2084cd7e9cf322d7a4f"
Sep 30 14:39:30 crc kubenswrapper[4676]: I0930 14:39:30.484194 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:39:30 crc kubenswrapper[4676]: E0930 14:39:30.484600 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:39:45 crc kubenswrapper[4676]: I0930 14:39:45.433249 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:39:45 crc kubenswrapper[4676]: E0930 14:39:45.434005 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:39:59 crc kubenswrapper[4676]: I0930 14:39:59.433541 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:39:59 crc kubenswrapper[4676]: E0930 14:39:59.434340 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:40:12 crc kubenswrapper[4676]: I0930 14:40:12.434097 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:40:12 crc kubenswrapper[4676]: E0930 14:40:12.435454 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:40:25 crc kubenswrapper[4676]: I0930 14:40:25.433370 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:40:25 crc kubenswrapper[4676]: E0930 14:40:25.434222 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:40:37 crc kubenswrapper[4676]: I0930 14:40:37.442508 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:40:37 crc kubenswrapper[4676]: E0930 14:40:37.445311 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:40:50 crc kubenswrapper[4676]: I0930 14:40:50.434027 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:40:50 crc kubenswrapper[4676]: E0930 14:40:50.435181 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:41:05 crc kubenswrapper[4676]: I0930 14:41:05.434194 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:41:05 crc kubenswrapper[4676]: E0930 14:41:05.435315 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:41:18 crc kubenswrapper[4676]: I0930 14:41:18.433207 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:41:18 crc kubenswrapper[4676]: E0930 14:41:18.434292 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:41:20 crc kubenswrapper[4676]: I0930 14:41:20.481043 4676 generic.go:334] "Generic (PLEG): container finished" podID="53729d22-521b-4f61-a225-832492a98b7b" containerID="2e2286f0d95c5f4b56bfeb8b1a30e2f1b9e6a3f0ed95580f3d53529837fa1e1e" exitCode=0
Sep 30 14:41:20 crc kubenswrapper[4676]: I0930 14:41:20.481097 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6" event={"ID":"53729d22-521b-4f61-a225-832492a98b7b","Type":"ContainerDied","Data":"2e2286f0d95c5f4b56bfeb8b1a30e2f1b9e6a3f0ed95580f3d53529837fa1e1e"}
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.891637 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.977700 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-ssh-key\") pod \"53729d22-521b-4f61-a225-832492a98b7b\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") "
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.977765 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glrc4\" (UniqueName: \"kubernetes.io/projected/53729d22-521b-4f61-a225-832492a98b7b-kube-api-access-glrc4\") pod \"53729d22-521b-4f61-a225-832492a98b7b\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") "
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.977829 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-inventory\") pod \"53729d22-521b-4f61-a225-832492a98b7b\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") "
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.977870 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-secret-0\") pod \"53729d22-521b-4f61-a225-832492a98b7b\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") "
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.977989 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-combined-ca-bundle\") pod \"53729d22-521b-4f61-a225-832492a98b7b\" (UID: \"53729d22-521b-4f61-a225-832492a98b7b\") "
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.984746 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "53729d22-521b-4f61-a225-832492a98b7b" (UID: "53729d22-521b-4f61-a225-832492a98b7b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:41:21 crc kubenswrapper[4676]: I0930 14:41:21.984975 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53729d22-521b-4f61-a225-832492a98b7b-kube-api-access-glrc4" (OuterVolumeSpecName: "kube-api-access-glrc4") pod "53729d22-521b-4f61-a225-832492a98b7b" (UID: "53729d22-521b-4f61-a225-832492a98b7b"). InnerVolumeSpecName "kube-api-access-glrc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.010640 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-inventory" (OuterVolumeSpecName: "inventory") pod "53729d22-521b-4f61-a225-832492a98b7b" (UID: "53729d22-521b-4f61-a225-832492a98b7b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.012554 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "53729d22-521b-4f61-a225-832492a98b7b" (UID: "53729d22-521b-4f61-a225-832492a98b7b"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.013906 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "53729d22-521b-4f61-a225-832492a98b7b" (UID: "53729d22-521b-4f61-a225-832492a98b7b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.080647 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.081051 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glrc4\" (UniqueName: \"kubernetes.io/projected/53729d22-521b-4f61-a225-832492a98b7b-kube-api-access-glrc4\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.081065 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.081075 4676 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.081084 4676 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53729d22-521b-4f61-a225-832492a98b7b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.504257 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6" event={"ID":"53729d22-521b-4f61-a225-832492a98b7b","Type":"ContainerDied","Data":"874be3f0e982847d6fb4fe3971442b61963db208a68c9dd7ccee27eee3a9a509"}
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.504307 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="874be3f0e982847d6fb4fe3971442b61963db208a68c9dd7ccee27eee3a9a509"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.504438 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.586311 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"]
Sep 30 14:41:22 crc kubenswrapper[4676]: E0930 14:41:22.587126 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53729d22-521b-4f61-a225-832492a98b7b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.587151 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="53729d22-521b-4f61-a225-832492a98b7b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.587371 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="53729d22-521b-4f61-a225-832492a98b7b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.588188 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.592096 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.592255 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.592274 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.592347 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.592526 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.593586 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.593692 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.599210 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"]
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690393 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690466 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldvkg\" (UniqueName: \"kubernetes.io/projected/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-kube-api-access-ldvkg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690502 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690519 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690559 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690577 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690600 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690615 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.690634 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792535 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792602 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvkg\" (UniqueName: \"kubernetes.io/projected/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-kube-api-access-ldvkg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792637 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792655 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792702 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792720 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792743 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792761 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.792780 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.794332 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.797215 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.797264 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.798164 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.802051 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.802630 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.802284 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.806388 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.810566 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvkg\" (UniqueName: \"kubernetes.io/projected/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-kube-api-access-ldvkg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gjtn5\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"
Sep 30 14:41:22 crc kubenswrapper[4676]: I0930 14:41:22.910248 4676 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" Sep 30 14:41:23 crc kubenswrapper[4676]: I0930 14:41:23.425478 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5"] Sep 30 14:41:23 crc kubenswrapper[4676]: I0930 14:41:23.427649 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:41:23 crc kubenswrapper[4676]: I0930 14:41:23.515058 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" event={"ID":"e4cb3ae8-cc50-4de4-b279-51105c6fc45c","Type":"ContainerStarted","Data":"5ca043888e772796f9a63564d182423eef98190793fad9758fac8a50000d1ed2"} Sep 30 14:41:24 crc kubenswrapper[4676]: I0930 14:41:24.524162 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" event={"ID":"e4cb3ae8-cc50-4de4-b279-51105c6fc45c","Type":"ContainerStarted","Data":"6f708a493cc4943bbb2dd6033149696682ac14a1879733b65cb3ce3bbe2e7062"} Sep 30 14:41:24 crc kubenswrapper[4676]: I0930 14:41:24.541713 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" podStartSLOduration=1.981081073 podStartE2EDuration="2.541694049s" podCreationTimestamp="2025-09-30 14:41:22 +0000 UTC" firstStartedPulling="2025-09-30 14:41:23.427306929 +0000 UTC m=+2587.410395358" lastFinishedPulling="2025-09-30 14:41:23.987919905 +0000 UTC m=+2587.971008334" observedRunningTime="2025-09-30 14:41:24.541128614 +0000 UTC m=+2588.524217043" watchObservedRunningTime="2025-09-30 14:41:24.541694049 +0000 UTC m=+2588.524782488" Sep 30 14:41:30 crc kubenswrapper[4676]: I0930 14:41:30.433452 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:41:30 crc kubenswrapper[4676]: E0930 14:41:30.434514 4676 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:41:41 crc kubenswrapper[4676]: I0930 14:41:41.433621 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:41:41 crc kubenswrapper[4676]: E0930 14:41:41.435672 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:41:52 crc kubenswrapper[4676]: I0930 14:41:52.433416 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:41:52 crc kubenswrapper[4676]: E0930 14:41:52.435139 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:41:58 crc kubenswrapper[4676]: I0930 14:41:58.911536 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nwnpr"] Sep 30 14:41:58 crc kubenswrapper[4676]: I0930 14:41:58.916170 
4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:58 crc kubenswrapper[4676]: I0930 14:41:58.921519 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwnpr"] Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.000288 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv5mm\" (UniqueName: \"kubernetes.io/projected/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-kube-api-access-fv5mm\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.000376 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-catalog-content\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.000481 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-utilities\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.102847 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-utilities\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.102993 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fv5mm\" (UniqueName: \"kubernetes.io/projected/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-kube-api-access-fv5mm\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.103052 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-catalog-content\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.103621 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-catalog-content\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.103904 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-utilities\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.124588 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv5mm\" (UniqueName: \"kubernetes.io/projected/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-kube-api-access-fv5mm\") pod \"redhat-operators-nwnpr\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.246035 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.684273 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwnpr"] Sep 30 14:41:59 crc kubenswrapper[4676]: I0930 14:41:59.825825 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwnpr" event={"ID":"057cf1fb-06a1-4b6e-b04b-f955a9f312d6","Type":"ContainerStarted","Data":"c049f1b19b7e3eea4cdcc357bdafeb5dadea53a6d7ae8e1d3ea5839dadc08185"} Sep 30 14:42:00 crc kubenswrapper[4676]: I0930 14:42:00.839958 4676 generic.go:334] "Generic (PLEG): container finished" podID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerID="731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e" exitCode=0 Sep 30 14:42:00 crc kubenswrapper[4676]: I0930 14:42:00.840069 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwnpr" event={"ID":"057cf1fb-06a1-4b6e-b04b-f955a9f312d6","Type":"ContainerDied","Data":"731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e"} Sep 30 14:42:01 crc kubenswrapper[4676]: I0930 14:42:01.850972 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwnpr" event={"ID":"057cf1fb-06a1-4b6e-b04b-f955a9f312d6","Type":"ContainerStarted","Data":"f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85"} Sep 30 14:42:04 crc kubenswrapper[4676]: I0930 14:42:04.875265 4676 generic.go:334] "Generic (PLEG): container finished" podID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerID="f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85" exitCode=0 Sep 30 14:42:04 crc kubenswrapper[4676]: I0930 14:42:04.875343 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwnpr" 
event={"ID":"057cf1fb-06a1-4b6e-b04b-f955a9f312d6","Type":"ContainerDied","Data":"f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85"} Sep 30 14:42:05 crc kubenswrapper[4676]: I0930 14:42:05.887292 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwnpr" event={"ID":"057cf1fb-06a1-4b6e-b04b-f955a9f312d6","Type":"ContainerStarted","Data":"68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4"} Sep 30 14:42:05 crc kubenswrapper[4676]: I0930 14:42:05.904039 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nwnpr" podStartSLOduration=3.224855388 podStartE2EDuration="7.904017745s" podCreationTimestamp="2025-09-30 14:41:58 +0000 UTC" firstStartedPulling="2025-09-30 14:42:00.84181897 +0000 UTC m=+2624.824907399" lastFinishedPulling="2025-09-30 14:42:05.520981327 +0000 UTC m=+2629.504069756" observedRunningTime="2025-09-30 14:42:05.902660729 +0000 UTC m=+2629.885749168" watchObservedRunningTime="2025-09-30 14:42:05.904017745 +0000 UTC m=+2629.887106174" Sep 30 14:42:06 crc kubenswrapper[4676]: I0930 14:42:06.432839 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:42:06 crc kubenswrapper[4676]: E0930 14:42:06.433122 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:42:09 crc kubenswrapper[4676]: I0930 14:42:09.246571 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:42:09 crc 
kubenswrapper[4676]: I0930 14:42:09.247220 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:42:10 crc kubenswrapper[4676]: I0930 14:42:10.346737 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nwnpr" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="registry-server" probeResult="failure" output=< Sep 30 14:42:10 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Sep 30 14:42:10 crc kubenswrapper[4676]: > Sep 30 14:42:19 crc kubenswrapper[4676]: I0930 14:42:19.300320 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:42:19 crc kubenswrapper[4676]: I0930 14:42:19.346700 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:42:19 crc kubenswrapper[4676]: I0930 14:42:19.434352 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:42:19 crc kubenswrapper[4676]: E0930 14:42:19.434741 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:42:19 crc kubenswrapper[4676]: I0930 14:42:19.550465 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwnpr"] Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.025274 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nwnpr" 
podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="registry-server" containerID="cri-o://68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4" gracePeriod=2 Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.513217 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.660228 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-catalog-content\") pod \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.660587 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-utilities\") pod \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.660640 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv5mm\" (UniqueName: \"kubernetes.io/projected/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-kube-api-access-fv5mm\") pod \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\" (UID: \"057cf1fb-06a1-4b6e-b04b-f955a9f312d6\") " Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.661161 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-utilities" (OuterVolumeSpecName: "utilities") pod "057cf1fb-06a1-4b6e-b04b-f955a9f312d6" (UID: "057cf1fb-06a1-4b6e-b04b-f955a9f312d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.661477 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.665949 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-kube-api-access-fv5mm" (OuterVolumeSpecName: "kube-api-access-fv5mm") pod "057cf1fb-06a1-4b6e-b04b-f955a9f312d6" (UID: "057cf1fb-06a1-4b6e-b04b-f955a9f312d6"). InnerVolumeSpecName "kube-api-access-fv5mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.741657 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "057cf1fb-06a1-4b6e-b04b-f955a9f312d6" (UID: "057cf1fb-06a1-4b6e-b04b-f955a9f312d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.762821 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:42:21 crc kubenswrapper[4676]: I0930 14:42:21.762872 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv5mm\" (UniqueName: \"kubernetes.io/projected/057cf1fb-06a1-4b6e-b04b-f955a9f312d6-kube-api-access-fv5mm\") on node \"crc\" DevicePath \"\"" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.036508 4676 generic.go:334] "Generic (PLEG): container finished" podID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerID="68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4" exitCode=0 Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.036822 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwnpr" event={"ID":"057cf1fb-06a1-4b6e-b04b-f955a9f312d6","Type":"ContainerDied","Data":"68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4"} Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.036977 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwnpr" event={"ID":"057cf1fb-06a1-4b6e-b04b-f955a9f312d6","Type":"ContainerDied","Data":"c049f1b19b7e3eea4cdcc357bdafeb5dadea53a6d7ae8e1d3ea5839dadc08185"} Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.037006 4676 scope.go:117] "RemoveContainer" containerID="68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.037134 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwnpr" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.058488 4676 scope.go:117] "RemoveContainer" containerID="f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.071071 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwnpr"] Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.079488 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nwnpr"] Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.094144 4676 scope.go:117] "RemoveContainer" containerID="731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.129003 4676 scope.go:117] "RemoveContainer" containerID="68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4" Sep 30 14:42:22 crc kubenswrapper[4676]: E0930 14:42:22.129397 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4\": container with ID starting with 68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4 not found: ID does not exist" containerID="68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.129441 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4"} err="failed to get container status \"68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4\": rpc error: code = NotFound desc = could not find container \"68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4\": container with ID starting with 68335730216a54c212ca45c0a061e50b6cf382384e133ed314da896c3184adb4 not found: ID does 
not exist" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.129467 4676 scope.go:117] "RemoveContainer" containerID="f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85" Sep 30 14:42:22 crc kubenswrapper[4676]: E0930 14:42:22.130177 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85\": container with ID starting with f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85 not found: ID does not exist" containerID="f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.130207 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85"} err="failed to get container status \"f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85\": rpc error: code = NotFound desc = could not find container \"f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85\": container with ID starting with f1740d838f9d76ae9c6345c784a18f4320f05b22c0ddfcd2af31fe3dd1094d85 not found: ID does not exist" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.130227 4676 scope.go:117] "RemoveContainer" containerID="731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e" Sep 30 14:42:22 crc kubenswrapper[4676]: E0930 14:42:22.130558 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e\": container with ID starting with 731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e not found: ID does not exist" containerID="731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e" Sep 30 14:42:22 crc kubenswrapper[4676]: I0930 14:42:22.130585 4676 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e"} err="failed to get container status \"731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e\": rpc error: code = NotFound desc = could not find container \"731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e\": container with ID starting with 731fa0ff900b727f9943157142d7a0752040cd2093a20c7e45e7709d816be47e not found: ID does not exist" Sep 30 14:42:23 crc kubenswrapper[4676]: I0930 14:42:23.446403 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" path="/var/lib/kubelet/pods/057cf1fb-06a1-4b6e-b04b-f955a9f312d6/volumes" Sep 30 14:42:30 crc kubenswrapper[4676]: I0930 14:42:30.434026 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:42:30 crc kubenswrapper[4676]: E0930 14:42:30.436384 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.065651 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4pwg"] Sep 30 14:42:33 crc kubenswrapper[4676]: E0930 14:42:33.066686 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="extract-utilities" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.066700 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="extract-utilities" Sep 30 
14:42:33 crc kubenswrapper[4676]: E0930 14:42:33.066720 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="registry-server" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.066727 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="registry-server" Sep 30 14:42:33 crc kubenswrapper[4676]: E0930 14:42:33.066750 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="extract-content" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.066758 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="extract-content" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.066986 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="057cf1fb-06a1-4b6e-b04b-f955a9f312d6" containerName="registry-server" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.068416 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.078129 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4pwg"] Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.217184 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7qzx\" (UniqueName: \"kubernetes.io/projected/164b828c-b3cd-4a7a-8dca-dbc3302e90be-kube-api-access-w7qzx\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.217330 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-utilities\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.217416 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-catalog-content\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.319420 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7qzx\" (UniqueName: \"kubernetes.io/projected/164b828c-b3cd-4a7a-8dca-dbc3302e90be-kube-api-access-w7qzx\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.319552 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-utilities\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.319636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-catalog-content\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.320318 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-catalog-content\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.320645 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-utilities\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.348278 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7qzx\" (UniqueName: \"kubernetes.io/projected/164b828c-b3cd-4a7a-8dca-dbc3302e90be-kube-api-access-w7qzx\") pod \"certified-operators-t4pwg\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:33 crc kubenswrapper[4676]: I0930 14:42:33.404166 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:34 crc kubenswrapper[4676]: I0930 14:42:34.040186 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4pwg"] Sep 30 14:42:34 crc kubenswrapper[4676]: I0930 14:42:34.164830 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4pwg" event={"ID":"164b828c-b3cd-4a7a-8dca-dbc3302e90be","Type":"ContainerStarted","Data":"b0203dfc7e07b3ee90ea1441bb9c9a835c8153fafb60660d2f9c222bb9cfc517"} Sep 30 14:42:35 crc kubenswrapper[4676]: I0930 14:42:35.175263 4676 generic.go:334] "Generic (PLEG): container finished" podID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerID="ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140" exitCode=0 Sep 30 14:42:35 crc kubenswrapper[4676]: I0930 14:42:35.175319 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4pwg" event={"ID":"164b828c-b3cd-4a7a-8dca-dbc3302e90be","Type":"ContainerDied","Data":"ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140"} Sep 30 14:42:36 crc kubenswrapper[4676]: I0930 14:42:36.187476 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4pwg" event={"ID":"164b828c-b3cd-4a7a-8dca-dbc3302e90be","Type":"ContainerStarted","Data":"a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3"} Sep 30 14:42:37 crc kubenswrapper[4676]: I0930 14:42:37.204594 4676 generic.go:334] "Generic (PLEG): container finished" podID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerID="a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3" exitCode=0 Sep 30 14:42:37 crc kubenswrapper[4676]: I0930 14:42:37.204642 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4pwg" 
event={"ID":"164b828c-b3cd-4a7a-8dca-dbc3302e90be","Type":"ContainerDied","Data":"a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3"} Sep 30 14:42:39 crc kubenswrapper[4676]: I0930 14:42:39.227578 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4pwg" event={"ID":"164b828c-b3cd-4a7a-8dca-dbc3302e90be","Type":"ContainerStarted","Data":"c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914"} Sep 30 14:42:39 crc kubenswrapper[4676]: I0930 14:42:39.260765 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4pwg" podStartSLOduration=3.524195864 podStartE2EDuration="6.260737343s" podCreationTimestamp="2025-09-30 14:42:33 +0000 UTC" firstStartedPulling="2025-09-30 14:42:35.178439988 +0000 UTC m=+2659.161528417" lastFinishedPulling="2025-09-30 14:42:37.914981467 +0000 UTC m=+2661.898069896" observedRunningTime="2025-09-30 14:42:39.250811152 +0000 UTC m=+2663.233899591" watchObservedRunningTime="2025-09-30 14:42:39.260737343 +0000 UTC m=+2663.243825772" Sep 30 14:42:41 crc kubenswrapper[4676]: I0930 14:42:41.433087 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:42:41 crc kubenswrapper[4676]: E0930 14:42:41.433605 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:42:43 crc kubenswrapper[4676]: I0930 14:42:43.404949 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:43 crc 
kubenswrapper[4676]: I0930 14:42:43.405280 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:43 crc kubenswrapper[4676]: I0930 14:42:43.451249 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:44 crc kubenswrapper[4676]: I0930 14:42:44.326233 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:44 crc kubenswrapper[4676]: I0930 14:42:44.376245 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4pwg"] Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.292378 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4pwg" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="registry-server" containerID="cri-o://c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914" gracePeriod=2 Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.786073 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.892505 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-utilities\") pod \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.892713 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7qzx\" (UniqueName: \"kubernetes.io/projected/164b828c-b3cd-4a7a-8dca-dbc3302e90be-kube-api-access-w7qzx\") pod \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.892814 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-catalog-content\") pod \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\" (UID: \"164b828c-b3cd-4a7a-8dca-dbc3302e90be\") " Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.893840 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-utilities" (OuterVolumeSpecName: "utilities") pod "164b828c-b3cd-4a7a-8dca-dbc3302e90be" (UID: "164b828c-b3cd-4a7a-8dca-dbc3302e90be"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.894496 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.898811 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164b828c-b3cd-4a7a-8dca-dbc3302e90be-kube-api-access-w7qzx" (OuterVolumeSpecName: "kube-api-access-w7qzx") pod "164b828c-b3cd-4a7a-8dca-dbc3302e90be" (UID: "164b828c-b3cd-4a7a-8dca-dbc3302e90be"). InnerVolumeSpecName "kube-api-access-w7qzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:42:46 crc kubenswrapper[4676]: I0930 14:42:46.996358 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7qzx\" (UniqueName: \"kubernetes.io/projected/164b828c-b3cd-4a7a-8dca-dbc3302e90be-kube-api-access-w7qzx\") on node \"crc\" DevicePath \"\"" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.035766 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "164b828c-b3cd-4a7a-8dca-dbc3302e90be" (UID: "164b828c-b3cd-4a7a-8dca-dbc3302e90be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.098746 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164b828c-b3cd-4a7a-8dca-dbc3302e90be-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.302229 4676 generic.go:334] "Generic (PLEG): container finished" podID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerID="c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914" exitCode=0 Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.302275 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4pwg" event={"ID":"164b828c-b3cd-4a7a-8dca-dbc3302e90be","Type":"ContainerDied","Data":"c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914"} Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.302293 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4pwg" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.302315 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4pwg" event={"ID":"164b828c-b3cd-4a7a-8dca-dbc3302e90be","Type":"ContainerDied","Data":"b0203dfc7e07b3ee90ea1441bb9c9a835c8153fafb60660d2f9c222bb9cfc517"} Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.302339 4676 scope.go:117] "RemoveContainer" containerID="c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.326135 4676 scope.go:117] "RemoveContainer" containerID="a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.337529 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4pwg"] Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.347045 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4pwg"] Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.355857 4676 scope.go:117] "RemoveContainer" containerID="ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.405435 4676 scope.go:117] "RemoveContainer" containerID="c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914" Sep 30 14:42:47 crc kubenswrapper[4676]: E0930 14:42:47.405951 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914\": container with ID starting with c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914 not found: ID does not exist" containerID="c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.406040 4676 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914"} err="failed to get container status \"c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914\": rpc error: code = NotFound desc = could not find container \"c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914\": container with ID starting with c7a68e0435c0b25fa8c5a280b0271d5ced66f51dd2fa79966d455caff5bde914 not found: ID does not exist" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.406064 4676 scope.go:117] "RemoveContainer" containerID="a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3" Sep 30 14:42:47 crc kubenswrapper[4676]: E0930 14:42:47.406406 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3\": container with ID starting with a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3 not found: ID does not exist" containerID="a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.406425 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3"} err="failed to get container status \"a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3\": rpc error: code = NotFound desc = could not find container \"a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3\": container with ID starting with a0872f8224cb8ba9a4127a003e834e2245c7cc41579ef2a7d2f4303f833e40f3 not found: ID does not exist" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.406438 4676 scope.go:117] "RemoveContainer" containerID="ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140" Sep 30 14:42:47 crc kubenswrapper[4676]: E0930 
14:42:47.406927 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140\": container with ID starting with ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140 not found: ID does not exist" containerID="ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.406951 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140"} err="failed to get container status \"ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140\": rpc error: code = NotFound desc = could not find container \"ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140\": container with ID starting with ce4899c94503aab59a31179dd6573d331f224a18e3d982e5a13c811063fc0140 not found: ID does not exist" Sep 30 14:42:47 crc kubenswrapper[4676]: I0930 14:42:47.454029 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" path="/var/lib/kubelet/pods/164b828c-b3cd-4a7a-8dca-dbc3302e90be/volumes" Sep 30 14:42:52 crc kubenswrapper[4676]: I0930 14:42:52.433608 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:42:52 crc kubenswrapper[4676]: E0930 14:42:52.434410 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:43:03 crc kubenswrapper[4676]: I0930 14:43:03.434250 
4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:43:03 crc kubenswrapper[4676]: E0930 14:43:03.435318 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:43:14 crc kubenswrapper[4676]: I0930 14:43:14.433734 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:43:14 crc kubenswrapper[4676]: E0930 14:43:14.435219 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:43:26 crc kubenswrapper[4676]: I0930 14:43:26.432835 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:43:26 crc kubenswrapper[4676]: E0930 14:43:26.433690 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:43:39 crc kubenswrapper[4676]: I0930 
14:43:39.433220 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:43:39 crc kubenswrapper[4676]: E0930 14:43:39.434083 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.806154 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x2f9t"] Sep 30 14:43:45 crc kubenswrapper[4676]: E0930 14:43:45.807109 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="extract-content" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.807138 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="extract-content" Sep 30 14:43:45 crc kubenswrapper[4676]: E0930 14:43:45.807152 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="registry-server" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.807158 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="registry-server" Sep 30 14:43:45 crc kubenswrapper[4676]: E0930 14:43:45.807184 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="extract-utilities" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.807192 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="extract-utilities" Sep 30 
14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.807371 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="164b828c-b3cd-4a7a-8dca-dbc3302e90be" containerName="registry-server" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.808933 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.856173 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x2f9t"] Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.932992 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-catalog-content\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.933100 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5q2m\" (UniqueName: \"kubernetes.io/projected/85eaa0fe-9999-4ea8-a55e-5718cc41846c-kube-api-access-k5q2m\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:45 crc kubenswrapper[4676]: I0930 14:43:45.933756 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-utilities\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.035387 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-catalog-content\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.035474 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5q2m\" (UniqueName: \"kubernetes.io/projected/85eaa0fe-9999-4ea8-a55e-5718cc41846c-kube-api-access-k5q2m\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.035525 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-utilities\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.035979 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-catalog-content\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.036022 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-utilities\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.056236 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5q2m\" (UniqueName: 
\"kubernetes.io/projected/85eaa0fe-9999-4ea8-a55e-5718cc41846c-kube-api-access-k5q2m\") pod \"community-operators-x2f9t\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.153680 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:46 crc kubenswrapper[4676]: I0930 14:43:46.897210 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x2f9t"] Sep 30 14:43:47 crc kubenswrapper[4676]: I0930 14:43:47.871928 4676 generic.go:334] "Generic (PLEG): container finished" podID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerID="77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021" exitCode=0 Sep 30 14:43:47 crc kubenswrapper[4676]: I0930 14:43:47.872026 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2f9t" event={"ID":"85eaa0fe-9999-4ea8-a55e-5718cc41846c","Type":"ContainerDied","Data":"77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021"} Sep 30 14:43:47 crc kubenswrapper[4676]: I0930 14:43:47.872936 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2f9t" event={"ID":"85eaa0fe-9999-4ea8-a55e-5718cc41846c","Type":"ContainerStarted","Data":"f3d56af74e9557fd7b5a97e8be26a86e10cc841631154432a81f9c6497bced24"} Sep 30 14:43:48 crc kubenswrapper[4676]: I0930 14:43:48.888529 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2f9t" event={"ID":"85eaa0fe-9999-4ea8-a55e-5718cc41846c","Type":"ContainerStarted","Data":"650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5"} Sep 30 14:43:49 crc kubenswrapper[4676]: I0930 14:43:49.897709 4676 generic.go:334] "Generic (PLEG): container finished" podID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" 
containerID="650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5" exitCode=0 Sep 30 14:43:49 crc kubenswrapper[4676]: I0930 14:43:49.897786 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2f9t" event={"ID":"85eaa0fe-9999-4ea8-a55e-5718cc41846c","Type":"ContainerDied","Data":"650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5"} Sep 30 14:43:50 crc kubenswrapper[4676]: I0930 14:43:50.909741 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2f9t" event={"ID":"85eaa0fe-9999-4ea8-a55e-5718cc41846c","Type":"ContainerStarted","Data":"e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306"} Sep 30 14:43:50 crc kubenswrapper[4676]: I0930 14:43:50.934086 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x2f9t" podStartSLOduration=3.426911851 podStartE2EDuration="5.934062002s" podCreationTimestamp="2025-09-30 14:43:45 +0000 UTC" firstStartedPulling="2025-09-30 14:43:47.874305469 +0000 UTC m=+2731.857393898" lastFinishedPulling="2025-09-30 14:43:50.38145562 +0000 UTC m=+2734.364544049" observedRunningTime="2025-09-30 14:43:50.930158641 +0000 UTC m=+2734.913247080" watchObservedRunningTime="2025-09-30 14:43:50.934062002 +0000 UTC m=+2734.917150431" Sep 30 14:43:53 crc kubenswrapper[4676]: I0930 14:43:53.434045 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:43:53 crc kubenswrapper[4676]: E0930 14:43:53.434598 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" 
podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:43:56 crc kubenswrapper[4676]: I0930 14:43:56.154375 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:56 crc kubenswrapper[4676]: I0930 14:43:56.154720 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:56 crc kubenswrapper[4676]: I0930 14:43:56.201957 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:57 crc kubenswrapper[4676]: I0930 14:43:57.021188 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:43:57 crc kubenswrapper[4676]: I0930 14:43:57.071088 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x2f9t"] Sep 30 14:43:58 crc kubenswrapper[4676]: I0930 14:43:58.987140 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x2f9t" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="registry-server" containerID="cri-o://e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306" gracePeriod=2 Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.399066 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-26tld"] Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.402535 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.418924 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26tld"] Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.521339 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b85gc\" (UniqueName: \"kubernetes.io/projected/1eecc90a-d003-4e27-96c5-4e26a42ccda4-kube-api-access-b85gc\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.521690 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-utilities\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.521773 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-catalog-content\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.624297 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b85gc\" (UniqueName: \"kubernetes.io/projected/1eecc90a-d003-4e27-96c5-4e26a42ccda4-kube-api-access-b85gc\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.624469 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-utilities\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.624504 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-catalog-content\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.625432 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-catalog-content\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.625571 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-utilities\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.661305 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b85gc\" (UniqueName: \"kubernetes.io/projected/1eecc90a-d003-4e27-96c5-4e26a42ccda4-kube-api-access-b85gc\") pod \"redhat-marketplace-26tld\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.734788 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:43:59 crc kubenswrapper[4676]: I0930 14:43:59.981539 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.010642 4676 generic.go:334] "Generic (PLEG): container finished" podID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerID="e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306" exitCode=0 Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.010693 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2f9t" event={"ID":"85eaa0fe-9999-4ea8-a55e-5718cc41846c","Type":"ContainerDied","Data":"e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306"} Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.010721 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2f9t" event={"ID":"85eaa0fe-9999-4ea8-a55e-5718cc41846c","Type":"ContainerDied","Data":"f3d56af74e9557fd7b5a97e8be26a86e10cc841631154432a81f9c6497bced24"} Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.010740 4676 scope.go:117] "RemoveContainer" containerID="e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.010994 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x2f9t" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.053846 4676 scope.go:117] "RemoveContainer" containerID="650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.078254 4676 scope.go:117] "RemoveContainer" containerID="77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.096747 4676 scope.go:117] "RemoveContainer" containerID="e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306" Sep 30 14:44:00 crc kubenswrapper[4676]: E0930 14:44:00.097450 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306\": container with ID starting with e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306 not found: ID does not exist" containerID="e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.097522 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306"} err="failed to get container status \"e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306\": rpc error: code = NotFound desc = could not find container \"e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306\": container with ID starting with e1b478687c09180e662396e0baf1cdcc355879d48900b25ee180310c83829306 not found: ID does not exist" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.097557 4676 scope.go:117] "RemoveContainer" containerID="650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5" Sep 30 14:44:00 crc kubenswrapper[4676]: E0930 14:44:00.097931 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5\": container with ID starting with 650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5 not found: ID does not exist" containerID="650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.097981 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5"} err="failed to get container status \"650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5\": rpc error: code = NotFound desc = could not find container \"650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5\": container with ID starting with 650a09f75fec3eb6b468a94bb3c30095651a39cb617abdd30dbac8c40b3825b5 not found: ID does not exist" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.098016 4676 scope.go:117] "RemoveContainer" containerID="77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021" Sep 30 14:44:00 crc kubenswrapper[4676]: E0930 14:44:00.098423 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021\": container with ID starting with 77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021 not found: ID does not exist" containerID="77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.098454 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021"} err="failed to get container status \"77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021\": rpc error: code = NotFound desc = could not find container 
\"77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021\": container with ID starting with 77eaf1da4af75fb69d0a6106982dcf42eae80b45bbe192f4b0c870fefab31021 not found: ID does not exist" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.135711 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-utilities\") pod \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.135780 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5q2m\" (UniqueName: \"kubernetes.io/projected/85eaa0fe-9999-4ea8-a55e-5718cc41846c-kube-api-access-k5q2m\") pod \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.135810 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-catalog-content\") pod \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\" (UID: \"85eaa0fe-9999-4ea8-a55e-5718cc41846c\") " Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.136810 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-utilities" (OuterVolumeSpecName: "utilities") pod "85eaa0fe-9999-4ea8-a55e-5718cc41846c" (UID: "85eaa0fe-9999-4ea8-a55e-5718cc41846c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.141477 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85eaa0fe-9999-4ea8-a55e-5718cc41846c-kube-api-access-k5q2m" (OuterVolumeSpecName: "kube-api-access-k5q2m") pod "85eaa0fe-9999-4ea8-a55e-5718cc41846c" (UID: "85eaa0fe-9999-4ea8-a55e-5718cc41846c"). InnerVolumeSpecName "kube-api-access-k5q2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.183088 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85eaa0fe-9999-4ea8-a55e-5718cc41846c" (UID: "85eaa0fe-9999-4ea8-a55e-5718cc41846c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.221955 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-26tld"] Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.238390 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.238427 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5q2m\" (UniqueName: \"kubernetes.io/projected/85eaa0fe-9999-4ea8-a55e-5718cc41846c-kube-api-access-k5q2m\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.238441 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85eaa0fe-9999-4ea8-a55e-5718cc41846c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 
14:44:00.371659 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x2f9t"] Sep 30 14:44:00 crc kubenswrapper[4676]: I0930 14:44:00.383720 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x2f9t"] Sep 30 14:44:01 crc kubenswrapper[4676]: I0930 14:44:01.023507 4676 generic.go:334] "Generic (PLEG): container finished" podID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerID="272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21" exitCode=0 Sep 30 14:44:01 crc kubenswrapper[4676]: I0930 14:44:01.023619 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26tld" event={"ID":"1eecc90a-d003-4e27-96c5-4e26a42ccda4","Type":"ContainerDied","Data":"272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21"} Sep 30 14:44:01 crc kubenswrapper[4676]: I0930 14:44:01.023872 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26tld" event={"ID":"1eecc90a-d003-4e27-96c5-4e26a42ccda4","Type":"ContainerStarted","Data":"214441de910df8050e1257ae52740734e7ad5a972f762429187101e1f7f7324e"} Sep 30 14:44:01 crc kubenswrapper[4676]: I0930 14:44:01.446265 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" path="/var/lib/kubelet/pods/85eaa0fe-9999-4ea8-a55e-5718cc41846c/volumes" Sep 30 14:44:03 crc kubenswrapper[4676]: I0930 14:44:03.070172 4676 generic.go:334] "Generic (PLEG): container finished" podID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerID="27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658" exitCode=0 Sep 30 14:44:03 crc kubenswrapper[4676]: I0930 14:44:03.070270 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26tld" 
event={"ID":"1eecc90a-d003-4e27-96c5-4e26a42ccda4","Type":"ContainerDied","Data":"27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658"} Sep 30 14:44:04 crc kubenswrapper[4676]: I0930 14:44:04.081405 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26tld" event={"ID":"1eecc90a-d003-4e27-96c5-4e26a42ccda4","Type":"ContainerStarted","Data":"9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63"} Sep 30 14:44:04 crc kubenswrapper[4676]: I0930 14:44:04.102210 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-26tld" podStartSLOduration=2.626714389 podStartE2EDuration="5.102185842s" podCreationTimestamp="2025-09-30 14:43:59 +0000 UTC" firstStartedPulling="2025-09-30 14:44:01.025619853 +0000 UTC m=+2745.008708282" lastFinishedPulling="2025-09-30 14:44:03.501091306 +0000 UTC m=+2747.484179735" observedRunningTime="2025-09-30 14:44:04.100450392 +0000 UTC m=+2748.083538821" watchObservedRunningTime="2025-09-30 14:44:04.102185842 +0000 UTC m=+2748.085274271" Sep 30 14:44:08 crc kubenswrapper[4676]: I0930 14:44:08.433771 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:44:08 crc kubenswrapper[4676]: E0930 14:44:08.434504 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:44:09 crc kubenswrapper[4676]: I0930 14:44:09.736410 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:44:09 crc 
kubenswrapper[4676]: I0930 14:44:09.736718 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:44:09 crc kubenswrapper[4676]: I0930 14:44:09.805842 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:44:10 crc kubenswrapper[4676]: I0930 14:44:10.180338 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:44:10 crc kubenswrapper[4676]: I0930 14:44:10.232975 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26tld"] Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.155466 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-26tld" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="registry-server" containerID="cri-o://9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63" gracePeriod=2 Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.632017 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.705845 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b85gc\" (UniqueName: \"kubernetes.io/projected/1eecc90a-d003-4e27-96c5-4e26a42ccda4-kube-api-access-b85gc\") pod \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.705939 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-catalog-content\") pod \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.706107 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-utilities\") pod \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\" (UID: \"1eecc90a-d003-4e27-96c5-4e26a42ccda4\") " Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.707028 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-utilities" (OuterVolumeSpecName: "utilities") pod "1eecc90a-d003-4e27-96c5-4e26a42ccda4" (UID: "1eecc90a-d003-4e27-96c5-4e26a42ccda4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.711656 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eecc90a-d003-4e27-96c5-4e26a42ccda4-kube-api-access-b85gc" (OuterVolumeSpecName: "kube-api-access-b85gc") pod "1eecc90a-d003-4e27-96c5-4e26a42ccda4" (UID: "1eecc90a-d003-4e27-96c5-4e26a42ccda4"). InnerVolumeSpecName "kube-api-access-b85gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.719920 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1eecc90a-d003-4e27-96c5-4e26a42ccda4" (UID: "1eecc90a-d003-4e27-96c5-4e26a42ccda4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.808178 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b85gc\" (UniqueName: \"kubernetes.io/projected/1eecc90a-d003-4e27-96c5-4e26a42ccda4-kube-api-access-b85gc\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.808208 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:12 crc kubenswrapper[4676]: I0930 14:44:12.808217 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eecc90a-d003-4e27-96c5-4e26a42ccda4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.168084 4676 generic.go:334] "Generic (PLEG): container finished" podID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerID="9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63" exitCode=0 Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.168151 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26tld" event={"ID":"1eecc90a-d003-4e27-96c5-4e26a42ccda4","Type":"ContainerDied","Data":"9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63"} Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.168201 4676 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-26tld" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.168235 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-26tld" event={"ID":"1eecc90a-d003-4e27-96c5-4e26a42ccda4","Type":"ContainerDied","Data":"214441de910df8050e1257ae52740734e7ad5a972f762429187101e1f7f7324e"} Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.168274 4676 scope.go:117] "RemoveContainer" containerID="9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.196944 4676 scope.go:117] "RemoveContainer" containerID="27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.213402 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-26tld"] Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.225416 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-26tld"] Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.232250 4676 scope.go:117] "RemoveContainer" containerID="272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.282361 4676 scope.go:117] "RemoveContainer" containerID="9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63" Sep 30 14:44:13 crc kubenswrapper[4676]: E0930 14:44:13.283825 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63\": container with ID starting with 9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63 not found: ID does not exist" containerID="9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.283918 4676 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63"} err="failed to get container status \"9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63\": rpc error: code = NotFound desc = could not find container \"9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63\": container with ID starting with 9603c551a9cf3cfa277bf2ec9c1282677b83ab1e68b3760f48c9b715322fcf63 not found: ID does not exist" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.283982 4676 scope.go:117] "RemoveContainer" containerID="27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658" Sep 30 14:44:13 crc kubenswrapper[4676]: E0930 14:44:13.284528 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658\": container with ID starting with 27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658 not found: ID does not exist" containerID="27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.284604 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658"} err="failed to get container status \"27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658\": rpc error: code = NotFound desc = could not find container \"27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658\": container with ID starting with 27c74e13e2a12d8e18166c8cfbc66e009fc1ce0a36fa0732f1b9481f290b2658 not found: ID does not exist" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.284643 4676 scope.go:117] "RemoveContainer" containerID="272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21" Sep 30 14:44:13 crc kubenswrapper[4676]: E0930 
14:44:13.285477 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21\": container with ID starting with 272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21 not found: ID does not exist" containerID="272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.285551 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21"} err="failed to get container status \"272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21\": rpc error: code = NotFound desc = could not find container \"272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21\": container with ID starting with 272e42d3b7c6a6b589a12ddef935457b0dda7820ef751f7a6bd95ff8552b4d21 not found: ID does not exist" Sep 30 14:44:13 crc kubenswrapper[4676]: I0930 14:44:13.446048 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" path="/var/lib/kubelet/pods/1eecc90a-d003-4e27-96c5-4e26a42ccda4/volumes" Sep 30 14:44:22 crc kubenswrapper[4676]: I0930 14:44:22.433841 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:44:22 crc kubenswrapper[4676]: E0930 14:44:22.435807 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:44:24 crc kubenswrapper[4676]: I0930 14:44:24.272716 
4676 generic.go:334] "Generic (PLEG): container finished" podID="e4cb3ae8-cc50-4de4-b279-51105c6fc45c" containerID="6f708a493cc4943bbb2dd6033149696682ac14a1879733b65cb3ce3bbe2e7062" exitCode=0 Sep 30 14:44:24 crc kubenswrapper[4676]: I0930 14:44:24.272793 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" event={"ID":"e4cb3ae8-cc50-4de4-b279-51105c6fc45c","Type":"ContainerDied","Data":"6f708a493cc4943bbb2dd6033149696682ac14a1879733b65cb3ce3bbe2e7062"} Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.716567 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.809818 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-combined-ca-bundle\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.809970 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-0\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.810020 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-1\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.810051 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-extra-config-0\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.810086 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldvkg\" (UniqueName: \"kubernetes.io/projected/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-kube-api-access-ldvkg\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.810170 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-ssh-key\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.810202 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-inventory\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.810219 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-1\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.810322 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-0\") pod \"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\" (UID: 
\"e4cb3ae8-cc50-4de4-b279-51105c6fc45c\") " Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.817674 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-kube-api-access-ldvkg" (OuterVolumeSpecName: "kube-api-access-ldvkg") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "kube-api-access-ldvkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.818577 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.846900 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.847331 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.851981 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.852449 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.852806 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.856047 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-inventory" (OuterVolumeSpecName: "inventory") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.857574 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e4cb3ae8-cc50-4de4-b279-51105c6fc45c" (UID: "e4cb3ae8-cc50-4de4-b279-51105c6fc45c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912712 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912747 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912757 4676 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912769 4676 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912779 4676 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912787 4676 reconciler_common.go:293] "Volume detached for 
volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912795 4676 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912804 4676 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:25 crc kubenswrapper[4676]: I0930 14:44:25.912813 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldvkg\" (UniqueName: \"kubernetes.io/projected/e4cb3ae8-cc50-4de4-b279-51105c6fc45c-kube-api-access-ldvkg\") on node \"crc\" DevicePath \"\"" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.292701 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" event={"ID":"e4cb3ae8-cc50-4de4-b279-51105c6fc45c","Type":"ContainerDied","Data":"5ca043888e772796f9a63564d182423eef98190793fad9758fac8a50000d1ed2"} Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.293103 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca043888e772796f9a63564d182423eef98190793fad9758fac8a50000d1ed2" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.293015 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gjtn5" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.402430 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk"] Sep 30 14:44:26 crc kubenswrapper[4676]: E0930 14:44:26.402974 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="extract-utilities" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403051 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="extract-utilities" Sep 30 14:44:26 crc kubenswrapper[4676]: E0930 14:44:26.403080 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="registry-server" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403091 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="registry-server" Sep 30 14:44:26 crc kubenswrapper[4676]: E0930 14:44:26.403111 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="extract-content" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403119 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="extract-content" Sep 30 14:44:26 crc kubenswrapper[4676]: E0930 14:44:26.403137 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="extract-content" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403146 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="extract-content" Sep 30 14:44:26 crc kubenswrapper[4676]: E0930 14:44:26.403160 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e4cb3ae8-cc50-4de4-b279-51105c6fc45c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403168 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cb3ae8-cc50-4de4-b279-51105c6fc45c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 14:44:26 crc kubenswrapper[4676]: E0930 14:44:26.403182 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="extract-utilities" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403190 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="extract-utilities" Sep 30 14:44:26 crc kubenswrapper[4676]: E0930 14:44:26.403202 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="registry-server" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403210 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="registry-server" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403433 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eecc90a-d003-4e27-96c5-4e26a42ccda4" containerName="registry-server" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403465 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="85eaa0fe-9999-4ea8-a55e-5718cc41846c" containerName="registry-server" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.403483 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cb3ae8-cc50-4de4-b279-51105c6fc45c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.404373 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.406723 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.410201 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mzpth" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.411198 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.411250 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.411265 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.412816 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk"] Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.523779 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.523942 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.524101 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fjbc\" (UniqueName: \"kubernetes.io/projected/6b813464-177e-4354-af90-edefef63c05c-kube-api-access-4fjbc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.524181 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.524493 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.524584 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.524634 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.626627 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.626701 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fjbc\" (UniqueName: \"kubernetes.io/projected/6b813464-177e-4354-af90-edefef63c05c-kube-api-access-4fjbc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.626739 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.626812 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.626846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.626898 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.626961 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.631712 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.632275 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.632310 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.633483 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.633821 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.661831 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.666583 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fjbc\" (UniqueName: \"kubernetes.io/projected/6b813464-177e-4354-af90-edefef63c05c-kube-api-access-4fjbc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:26 crc kubenswrapper[4676]: I0930 14:44:26.729665 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" Sep 30 14:44:27 crc kubenswrapper[4676]: I0930 14:44:27.298415 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk"] Sep 30 14:44:28 crc kubenswrapper[4676]: I0930 14:44:28.315267 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" event={"ID":"6b813464-177e-4354-af90-edefef63c05c","Type":"ContainerStarted","Data":"00d401e790bd294948bd55648d1d21179432239dbfa923f9e4f79fce2cc8badd"} Sep 30 14:44:28 crc kubenswrapper[4676]: I0930 14:44:28.315568 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" event={"ID":"6b813464-177e-4354-af90-edefef63c05c","Type":"ContainerStarted","Data":"9e9e8390471c22757bbb71d7dcc81a13d0bba6731050bdf1e916914824fb658a"} Sep 30 14:44:28 crc kubenswrapper[4676]: I0930 14:44:28.352124 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" podStartSLOduration=1.786856256 podStartE2EDuration="2.352093744s" podCreationTimestamp="2025-09-30 14:44:26 +0000 UTC" firstStartedPulling="2025-09-30 14:44:27.30682576 +0000 UTC m=+2771.289914189" lastFinishedPulling="2025-09-30 14:44:27.872063248 +0000 UTC m=+2771.855151677" observedRunningTime="2025-09-30 14:44:28.345686043 +0000 UTC m=+2772.328774512" watchObservedRunningTime="2025-09-30 14:44:28.352093744 +0000 UTC m=+2772.335182183" Sep 30 14:44:35 crc kubenswrapper[4676]: I0930 14:44:35.433824 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9" Sep 30 14:44:36 crc kubenswrapper[4676]: I0930 14:44:36.393230 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"3160fb233518301ea3275b80669a99f94fc88529803fe386827f9f3c64471af5"} Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.150452 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76"] Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.152410 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.155844 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.158381 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.163033 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76"] Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.238600 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/97252acd-f0bc-408c-8093-f94fa42e5632-kube-api-access-k6pq9\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.238742 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97252acd-f0bc-408c-8093-f94fa42e5632-secret-volume\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.238830 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97252acd-f0bc-408c-8093-f94fa42e5632-config-volume\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.340933 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/97252acd-f0bc-408c-8093-f94fa42e5632-kube-api-access-k6pq9\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.341073 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97252acd-f0bc-408c-8093-f94fa42e5632-secret-volume\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.341143 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97252acd-f0bc-408c-8093-f94fa42e5632-config-volume\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.342188 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97252acd-f0bc-408c-8093-f94fa42e5632-config-volume\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.350910 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/97252acd-f0bc-408c-8093-f94fa42e5632-secret-volume\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.359931 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/97252acd-f0bc-408c-8093-f94fa42e5632-kube-api-access-k6pq9\") pod \"collect-profiles-29320725-22f76\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.487252 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:00 crc kubenswrapper[4676]: I0930 14:45:00.960774 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76"] Sep 30 14:45:01 crc kubenswrapper[4676]: I0930 14:45:01.634792 4676 generic.go:334] "Generic (PLEG): container finished" podID="97252acd-f0bc-408c-8093-f94fa42e5632" containerID="9b6da16987fe8c395243988862ce2a4f7cd754e06fe27fc30e8d6870434ac12d" exitCode=0 Sep 30 14:45:01 crc kubenswrapper[4676]: I0930 14:45:01.634860 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" event={"ID":"97252acd-f0bc-408c-8093-f94fa42e5632","Type":"ContainerDied","Data":"9b6da16987fe8c395243988862ce2a4f7cd754e06fe27fc30e8d6870434ac12d"} Sep 30 14:45:01 crc kubenswrapper[4676]: I0930 14:45:01.635146 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" 
event={"ID":"97252acd-f0bc-408c-8093-f94fa42e5632","Type":"ContainerStarted","Data":"2602ffb3dbab72fe3d44dab473ab1ffb2fb7693840f9633083e6f996b6e2c20b"} Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.014633 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.100147 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97252acd-f0bc-408c-8093-f94fa42e5632-config-volume\") pod \"97252acd-f0bc-408c-8093-f94fa42e5632\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.100367 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/97252acd-f0bc-408c-8093-f94fa42e5632-kube-api-access-k6pq9\") pod \"97252acd-f0bc-408c-8093-f94fa42e5632\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.100465 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97252acd-f0bc-408c-8093-f94fa42e5632-secret-volume\") pod \"97252acd-f0bc-408c-8093-f94fa42e5632\" (UID: \"97252acd-f0bc-408c-8093-f94fa42e5632\") " Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.100985 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97252acd-f0bc-408c-8093-f94fa42e5632-config-volume" (OuterVolumeSpecName: "config-volume") pod "97252acd-f0bc-408c-8093-f94fa42e5632" (UID: "97252acd-f0bc-408c-8093-f94fa42e5632"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.106994 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97252acd-f0bc-408c-8093-f94fa42e5632-kube-api-access-k6pq9" (OuterVolumeSpecName: "kube-api-access-k6pq9") pod "97252acd-f0bc-408c-8093-f94fa42e5632" (UID: "97252acd-f0bc-408c-8093-f94fa42e5632"). InnerVolumeSpecName "kube-api-access-k6pq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.107058 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97252acd-f0bc-408c-8093-f94fa42e5632-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97252acd-f0bc-408c-8093-f94fa42e5632" (UID: "97252acd-f0bc-408c-8093-f94fa42e5632"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.202707 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97252acd-f0bc-408c-8093-f94fa42e5632-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.203056 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6pq9\" (UniqueName: \"kubernetes.io/projected/97252acd-f0bc-408c-8093-f94fa42e5632-kube-api-access-k6pq9\") on node \"crc\" DevicePath \"\"" Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.203067 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97252acd-f0bc-408c-8093-f94fa42e5632-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.654945 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76" 
event={"ID":"97252acd-f0bc-408c-8093-f94fa42e5632","Type":"ContainerDied","Data":"2602ffb3dbab72fe3d44dab473ab1ffb2fb7693840f9633083e6f996b6e2c20b"}
Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.654995 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2602ffb3dbab72fe3d44dab473ab1ffb2fb7693840f9633083e6f996b6e2c20b"
Sep 30 14:45:03 crc kubenswrapper[4676]: I0930 14:45:03.655007 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320725-22f76"
Sep 30 14:45:04 crc kubenswrapper[4676]: I0930 14:45:04.087627 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h"]
Sep 30 14:45:04 crc kubenswrapper[4676]: I0930 14:45:04.096029 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320680-q9p7h"]
Sep 30 14:45:05 crc kubenswrapper[4676]: I0930 14:45:05.445347 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6709a903-5bfa-42c6-be52-df8efb1d106e" path="/var/lib/kubelet/pods/6709a903-5bfa-42c6-be52-df8efb1d106e/volumes"
Sep 30 14:45:22 crc kubenswrapper[4676]: I0930 14:45:22.732752 4676 scope.go:117] "RemoveContainer" containerID="519d01dc7fce15dba99590d1f77048e20d3892fdab8716958f9af3893e8ad4b7"
Sep 30 14:46:37 crc kubenswrapper[4676]: I0930 14:46:37.530302 4676 generic.go:334] "Generic (PLEG): container finished" podID="6b813464-177e-4354-af90-edefef63c05c" containerID="00d401e790bd294948bd55648d1d21179432239dbfa923f9e4f79fce2cc8badd" exitCode=0
Sep 30 14:46:37 crc kubenswrapper[4676]: I0930 14:46:37.530401 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" event={"ID":"6b813464-177e-4354-af90-edefef63c05c","Type":"ContainerDied","Data":"00d401e790bd294948bd55648d1d21179432239dbfa923f9e4f79fce2cc8badd"}
Sep 30 14:46:38 crc kubenswrapper[4676]: I0930 14:46:38.946858 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk"
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.091067 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-2\") pod \"6b813464-177e-4354-af90-edefef63c05c\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") "
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.091247 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fjbc\" (UniqueName: \"kubernetes.io/projected/6b813464-177e-4354-af90-edefef63c05c-kube-api-access-4fjbc\") pod \"6b813464-177e-4354-af90-edefef63c05c\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") "
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.091318 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-0\") pod \"6b813464-177e-4354-af90-edefef63c05c\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") "
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.091376 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-telemetry-combined-ca-bundle\") pod \"6b813464-177e-4354-af90-edefef63c05c\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") "
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.091403 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-inventory\") pod \"6b813464-177e-4354-af90-edefef63c05c\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") "
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.091439 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-1\") pod \"6b813464-177e-4354-af90-edefef63c05c\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") "
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.091509 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ssh-key\") pod \"6b813464-177e-4354-af90-edefef63c05c\" (UID: \"6b813464-177e-4354-af90-edefef63c05c\") "
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.098176 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6b813464-177e-4354-af90-edefef63c05c" (UID: "6b813464-177e-4354-af90-edefef63c05c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.098555 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b813464-177e-4354-af90-edefef63c05c-kube-api-access-4fjbc" (OuterVolumeSpecName: "kube-api-access-4fjbc") pod "6b813464-177e-4354-af90-edefef63c05c" (UID: "6b813464-177e-4354-af90-edefef63c05c"). InnerVolumeSpecName "kube-api-access-4fjbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.125300 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "6b813464-177e-4354-af90-edefef63c05c" (UID: "6b813464-177e-4354-af90-edefef63c05c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.125319 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-inventory" (OuterVolumeSpecName: "inventory") pod "6b813464-177e-4354-af90-edefef63c05c" (UID: "6b813464-177e-4354-af90-edefef63c05c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.133142 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "6b813464-177e-4354-af90-edefef63c05c" (UID: "6b813464-177e-4354-af90-edefef63c05c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.133653 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b813464-177e-4354-af90-edefef63c05c" (UID: "6b813464-177e-4354-af90-edefef63c05c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.143294 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "6b813464-177e-4354-af90-edefef63c05c" (UID: "6b813464-177e-4354-af90-edefef63c05c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.193951 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.194000 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fjbc\" (UniqueName: \"kubernetes.io/projected/6b813464-177e-4354-af90-edefef63c05c-kube-api-access-4fjbc\") on node \"crc\" DevicePath \"\""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.194013 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.194131 4676 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.194151 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.194165 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.194176 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b813464-177e-4354-af90-edefef63c05c-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.551545 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk" event={"ID":"6b813464-177e-4354-af90-edefef63c05c","Type":"ContainerDied","Data":"9e9e8390471c22757bbb71d7dcc81a13d0bba6731050bdf1e916914824fb658a"}
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.551595 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e9e8390471c22757bbb71d7dcc81a13d0bba6731050bdf1e916914824fb658a"
Sep 30 14:46:39 crc kubenswrapper[4676]: I0930 14:46:39.551629 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk"
Sep 30 14:46:59 crc kubenswrapper[4676]: I0930 14:46:59.919117 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:46:59 crc kubenswrapper[4676]: I0930 14:46:59.920087 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.035266 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Sep 30 14:47:23 crc kubenswrapper[4676]: E0930 14:47:23.036594 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b813464-177e-4354-af90-edefef63c05c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.036617 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b813464-177e-4354-af90-edefef63c05c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:47:23 crc kubenswrapper[4676]: E0930 14:47:23.036635 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97252acd-f0bc-408c-8093-f94fa42e5632" containerName="collect-profiles"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.036643 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="97252acd-f0bc-408c-8093-f94fa42e5632" containerName="collect-profiles"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.036936 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b813464-177e-4354-af90-edefef63c05c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.036969 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="97252acd-f0bc-408c-8093-f94fa42e5632" containerName="collect-profiles"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.037735 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.043108 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.043684 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.044475 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.046085 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qlpht"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.049691 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.180650 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw9nq\" (UniqueName: \"kubernetes.io/projected/10b046c1-241e-4dfd-9aa3-d3e5532a6190-kube-api-access-vw9nq\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.180717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-config-data\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.180743 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.180817 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.180867 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.181220 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.181328 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.181577 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.183037 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.284861 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285193 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285339 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285445 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285599 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285721 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285824 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw9nq\" (UniqueName: \"kubernetes.io/projected/10b046c1-241e-4dfd-9aa3-d3e5532a6190-kube-api-access-vw9nq\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285951 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.286063 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-config-data\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.286006 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.286076 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.285473 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.286674 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.287085 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-config-data\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.291472 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.292067 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.294429 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.303051 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw9nq\" (UniqueName: \"kubernetes.io/projected/10b046c1-241e-4dfd-9aa3-d3e5532a6190-kube-api-access-vw9nq\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.314772 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.372847 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.795532 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Sep 30 14:47:23 crc kubenswrapper[4676]: W0930 14:47:23.801126 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10b046c1_241e_4dfd_9aa3_d3e5532a6190.slice/crio-2e6525e6220f788aead0c6e8543e2695df88158cf32b944fa2e94abf1b4369bc WatchSource:0}: Error finding container 2e6525e6220f788aead0c6e8543e2695df88158cf32b944fa2e94abf1b4369bc: Status 404 returned error can't find the container with id 2e6525e6220f788aead0c6e8543e2695df88158cf32b944fa2e94abf1b4369bc
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.803441 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 14:47:23 crc kubenswrapper[4676]: I0930 14:47:23.948373 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10b046c1-241e-4dfd-9aa3-d3e5532a6190","Type":"ContainerStarted","Data":"2e6525e6220f788aead0c6e8543e2695df88158cf32b944fa2e94abf1b4369bc"}
Sep 30 14:47:29 crc kubenswrapper[4676]: I0930 14:47:29.919777 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:47:29 crc kubenswrapper[4676]: I0930 14:47:29.920599 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:47:48 crc kubenswrapper[4676]: E0930 14:47:48.658175 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Sep 30 14:47:48 crc kubenswrapper[4676]: E0930 14:47:48.659027 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vw9nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(10b046c1-241e-4dfd-9aa3-d3e5532a6190): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 14:47:48 crc kubenswrapper[4676]: E0930 14:47:48.660212 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="10b046c1-241e-4dfd-9aa3-d3e5532a6190"
Sep 30 14:47:49 crc kubenswrapper[4676]: E0930 14:47:49.198570 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="10b046c1-241e-4dfd-9aa3-d3e5532a6190"
Sep 30 14:47:59 crc kubenswrapper[4676]: I0930 14:47:59.919740 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:47:59 crc kubenswrapper[4676]: I0930 14:47:59.920420 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:47:59 crc kubenswrapper[4676]: I0930 14:47:59.920470 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 14:47:59 crc kubenswrapper[4676]: I0930 14:47:59.921208 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3160fb233518301ea3275b80669a99f94fc88529803fe386827f9f3c64471af5"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 14:47:59 crc kubenswrapper[4676]: I0930 14:47:59.921268 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://3160fb233518301ea3275b80669a99f94fc88529803fe386827f9f3c64471af5" gracePeriod=600
Sep 30 14:48:00 crc kubenswrapper[4676]: I0930 14:48:00.302369 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="3160fb233518301ea3275b80669a99f94fc88529803fe386827f9f3c64471af5" exitCode=0
Sep 30 14:48:00 crc kubenswrapper[4676]: I0930 14:48:00.302440 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"3160fb233518301ea3275b80669a99f94fc88529803fe386827f9f3c64471af5"}
Sep 30 14:48:00 crc kubenswrapper[4676]: I0930 14:48:00.303021 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424"}
Sep 30 14:48:00 crc kubenswrapper[4676]: I0930 14:48:00.303059 4676 scope.go:117] "RemoveContainer" containerID="477c0a9d00ef4eb891b1ca42013ec95f070550ef2a4a839127e5b5c32ca124a9"
Sep 30 14:48:04 crc kubenswrapper[4676]: I0930 14:48:04.347325 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10b046c1-241e-4dfd-9aa3-d3e5532a6190","Type":"ContainerStarted","Data":"bfe576d41d89b88bedf6a386cea7c8e7bdfcbe27bf4195eb56c4e647af73d0f0"}
Sep 30 14:48:04 crc kubenswrapper[4676]: I0930 14:48:04.370426 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.182912682 podStartE2EDuration="42.370406925s" podCreationTimestamp="2025-09-30 14:47:22 +0000 UTC" firstStartedPulling="2025-09-30 14:47:23.803268132 +0000 UTC m=+2947.786356551" lastFinishedPulling="2025-09-30 14:48:02.990762365 +0000 UTC m=+2986.973850794" observedRunningTime="2025-09-30 14:48:04.364677797 +0000 UTC m=+2988.347766236" watchObservedRunningTime="2025-09-30 14:48:04.370406925 +0000 UTC m=+2988.353495354"
Sep 30 14:50:29 crc kubenswrapper[4676]: I0930 14:50:29.919544 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:50:29 crc kubenswrapper[4676]: I0930 14:50:29.920283 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:50:59 crc kubenswrapper[4676]: I0930 14:50:59.920299 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:50:59 crc kubenswrapper[4676]: I0930 14:50:59.921434 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:51:29 crc kubenswrapper[4676]: I0930 14:51:29.919387 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 14:51:29 crc kubenswrapper[4676]: I0930 14:51:29.920023 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 14:51:29 crc kubenswrapper[4676]: I0930 14:51:29.920071 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 14:51:29 crc kubenswrapper[4676]: I0930 14:51:29.920771 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 14:51:29 crc kubenswrapper[4676]: I0930 14:51:29.920826 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" gracePeriod=600
Sep 30 14:51:30 crc kubenswrapper[4676]: E0930 14:51:30.040534 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:51:30 crc kubenswrapper[4676]: I0930 14:51:30.218799 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" exitCode=0
Sep 30 14:51:30 crc kubenswrapper[4676]: I0930 14:51:30.218902 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424"}
Sep 30 14:51:30 crc kubenswrapper[4676]: I0930 14:51:30.219497 4676 scope.go:117] "RemoveContainer" containerID="3160fb233518301ea3275b80669a99f94fc88529803fe386827f9f3c64471af5"
Sep 30 14:51:30 crc kubenswrapper[4676]: I0930 14:51:30.220202 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424"
Sep 30 14:51:30 crc kubenswrapper[4676]: E0930 14:51:30.220446 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
Sep 30 14:51:43 crc kubenswrapper[4676]: I0930 14:51:43.433806 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424"
Sep 30 14:51:43 crc kubenswrapper[4676]: E0930 14:51:43.434663 4676 pod_workers.go:1301] "Error syncing
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:51:57 crc kubenswrapper[4676]: I0930 14:51:57.441042 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:51:57 crc kubenswrapper[4676]: E0930 14:51:57.441775 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.153999 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sjqk5"] Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.157180 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.164870 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-catalog-content\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.164937 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85bl9\" (UniqueName: \"kubernetes.io/projected/81fbd89d-e247-4297-bccd-7f4e7c6939de-kube-api-access-85bl9\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.165088 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-utilities\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.168459 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sjqk5"] Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.267647 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-utilities\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.267942 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-catalog-content\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.267980 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85bl9\" (UniqueName: \"kubernetes.io/projected/81fbd89d-e247-4297-bccd-7f4e7c6939de-kube-api-access-85bl9\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.268318 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-utilities\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.268419 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-catalog-content\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.294351 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85bl9\" (UniqueName: \"kubernetes.io/projected/81fbd89d-e247-4297-bccd-7f4e7c6939de-kube-api-access-85bl9\") pod \"redhat-operators-sjqk5\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.484613 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:03 crc kubenswrapper[4676]: I0930 14:52:03.958492 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sjqk5"] Sep 30 14:52:04 crc kubenswrapper[4676]: I0930 14:52:04.535139 4676 generic.go:334] "Generic (PLEG): container finished" podID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerID="aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89" exitCode=0 Sep 30 14:52:04 crc kubenswrapper[4676]: I0930 14:52:04.535195 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjqk5" event={"ID":"81fbd89d-e247-4297-bccd-7f4e7c6939de","Type":"ContainerDied","Data":"aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89"} Sep 30 14:52:04 crc kubenswrapper[4676]: I0930 14:52:04.536102 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjqk5" event={"ID":"81fbd89d-e247-4297-bccd-7f4e7c6939de","Type":"ContainerStarted","Data":"4385ebac5f8d09ea97a1f3d66a1e01906ef2d4ae0636bf409bcfe97360570e1a"} Sep 30 14:52:06 crc kubenswrapper[4676]: I0930 14:52:06.556381 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjqk5" event={"ID":"81fbd89d-e247-4297-bccd-7f4e7c6939de","Type":"ContainerStarted","Data":"6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710"} Sep 30 14:52:09 crc kubenswrapper[4676]: I0930 14:52:09.587193 4676 generic.go:334] "Generic (PLEG): container finished" podID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerID="6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710" exitCode=0 Sep 30 14:52:09 crc kubenswrapper[4676]: I0930 14:52:09.587273 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjqk5" 
event={"ID":"81fbd89d-e247-4297-bccd-7f4e7c6939de","Type":"ContainerDied","Data":"6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710"} Sep 30 14:52:10 crc kubenswrapper[4676]: I0930 14:52:10.433532 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:52:10 crc kubenswrapper[4676]: E0930 14:52:10.434397 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:52:10 crc kubenswrapper[4676]: I0930 14:52:10.612241 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjqk5" event={"ID":"81fbd89d-e247-4297-bccd-7f4e7c6939de","Type":"ContainerStarted","Data":"6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b"} Sep 30 14:52:10 crc kubenswrapper[4676]: I0930 14:52:10.635216 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sjqk5" podStartSLOduration=2.14559943 podStartE2EDuration="7.635195196s" podCreationTimestamp="2025-09-30 14:52:03 +0000 UTC" firstStartedPulling="2025-09-30 14:52:04.537129257 +0000 UTC m=+3228.520217686" lastFinishedPulling="2025-09-30 14:52:10.026725023 +0000 UTC m=+3234.009813452" observedRunningTime="2025-09-30 14:52:10.632839134 +0000 UTC m=+3234.615927563" watchObservedRunningTime="2025-09-30 14:52:10.635195196 +0000 UTC m=+3234.618283635" Sep 30 14:52:13 crc kubenswrapper[4676]: I0930 14:52:13.485407 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:13 crc 
kubenswrapper[4676]: I0930 14:52:13.486012 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:14 crc kubenswrapper[4676]: I0930 14:52:14.533087 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sjqk5" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="registry-server" probeResult="failure" output=< Sep 30 14:52:14 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Sep 30 14:52:14 crc kubenswrapper[4676]: > Sep 30 14:52:23 crc kubenswrapper[4676]: I0930 14:52:23.528080 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:23 crc kubenswrapper[4676]: I0930 14:52:23.576614 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:23 crc kubenswrapper[4676]: I0930 14:52:23.771517 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sjqk5"] Sep 30 14:52:24 crc kubenswrapper[4676]: I0930 14:52:24.434119 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:52:24 crc kubenswrapper[4676]: E0930 14:52:24.434471 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:52:24 crc kubenswrapper[4676]: I0930 14:52:24.747229 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sjqk5" 
podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="registry-server" containerID="cri-o://6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b" gracePeriod=2 Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.245155 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.424938 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-utilities\") pod \"81fbd89d-e247-4297-bccd-7f4e7c6939de\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.425080 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-catalog-content\") pod \"81fbd89d-e247-4297-bccd-7f4e7c6939de\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.425152 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85bl9\" (UniqueName: \"kubernetes.io/projected/81fbd89d-e247-4297-bccd-7f4e7c6939de-kube-api-access-85bl9\") pod \"81fbd89d-e247-4297-bccd-7f4e7c6939de\" (UID: \"81fbd89d-e247-4297-bccd-7f4e7c6939de\") " Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.426071 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-utilities" (OuterVolumeSpecName: "utilities") pod "81fbd89d-e247-4297-bccd-7f4e7c6939de" (UID: "81fbd89d-e247-4297-bccd-7f4e7c6939de"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.432608 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fbd89d-e247-4297-bccd-7f4e7c6939de-kube-api-access-85bl9" (OuterVolumeSpecName: "kube-api-access-85bl9") pod "81fbd89d-e247-4297-bccd-7f4e7c6939de" (UID: "81fbd89d-e247-4297-bccd-7f4e7c6939de"). InnerVolumeSpecName "kube-api-access-85bl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.515044 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81fbd89d-e247-4297-bccd-7f4e7c6939de" (UID: "81fbd89d-e247-4297-bccd-7f4e7c6939de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.527865 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.527922 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fbd89d-e247-4297-bccd-7f4e7c6939de-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.527935 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85bl9\" (UniqueName: \"kubernetes.io/projected/81fbd89d-e247-4297-bccd-7f4e7c6939de-kube-api-access-85bl9\") on node \"crc\" DevicePath \"\"" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.758485 4676 generic.go:334] "Generic (PLEG): container finished" podID="81fbd89d-e247-4297-bccd-7f4e7c6939de" 
containerID="6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b" exitCode=0 Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.758562 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjqk5" event={"ID":"81fbd89d-e247-4297-bccd-7f4e7c6939de","Type":"ContainerDied","Data":"6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b"} Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.758610 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjqk5" event={"ID":"81fbd89d-e247-4297-bccd-7f4e7c6939de","Type":"ContainerDied","Data":"4385ebac5f8d09ea97a1f3d66a1e01906ef2d4ae0636bf409bcfe97360570e1a"} Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.758626 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sjqk5" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.758639 4676 scope.go:117] "RemoveContainer" containerID="6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.780596 4676 scope.go:117] "RemoveContainer" containerID="6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.793873 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sjqk5"] Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.801312 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sjqk5"] Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.804177 4676 scope.go:117] "RemoveContainer" containerID="aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.858614 4676 scope.go:117] "RemoveContainer" containerID="6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b" Sep 30 14:52:25 crc 
kubenswrapper[4676]: E0930 14:52:25.860481 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b\": container with ID starting with 6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b not found: ID does not exist" containerID="6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.860522 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b"} err="failed to get container status \"6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b\": rpc error: code = NotFound desc = could not find container \"6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b\": container with ID starting with 6832b8a15f9f497384c931a9f3d7c1342afee2edba886813a3da36825c58254b not found: ID does not exist" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.860549 4676 scope.go:117] "RemoveContainer" containerID="6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710" Sep 30 14:52:25 crc kubenswrapper[4676]: E0930 14:52:25.861218 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710\": container with ID starting with 6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710 not found: ID does not exist" containerID="6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.861249 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710"} err="failed to get container status 
\"6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710\": rpc error: code = NotFound desc = could not find container \"6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710\": container with ID starting with 6bb7f76ab97e057349beb752a8c68111d09640770ef457feae4d23bacb79d710 not found: ID does not exist" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.861312 4676 scope.go:117] "RemoveContainer" containerID="aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89" Sep 30 14:52:25 crc kubenswrapper[4676]: E0930 14:52:25.861677 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89\": container with ID starting with aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89 not found: ID does not exist" containerID="aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89" Sep 30 14:52:25 crc kubenswrapper[4676]: I0930 14:52:25.861720 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89"} err="failed to get container status \"aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89\": rpc error: code = NotFound desc = could not find container \"aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89\": container with ID starting with aeb84e8c97ab8f63d4ba498ff6dcb84dd598a3a7ac9c9690e76cb74813490a89 not found: ID does not exist" Sep 30 14:52:27 crc kubenswrapper[4676]: I0930 14:52:27.447355 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" path="/var/lib/kubelet/pods/81fbd89d-e247-4297-bccd-7f4e7c6939de/volumes" Sep 30 14:52:36 crc kubenswrapper[4676]: I0930 14:52:36.433166 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 
14:52:36 crc kubenswrapper[4676]: E0930 14:52:36.434176 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:52:51 crc kubenswrapper[4676]: I0930 14:52:51.433730 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:52:51 crc kubenswrapper[4676]: E0930 14:52:51.434588 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:53:02 crc kubenswrapper[4676]: I0930 14:53:02.434016 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:53:02 crc kubenswrapper[4676]: E0930 14:53:02.435305 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:53:13 crc kubenswrapper[4676]: I0930 14:53:13.432770 4676 scope.go:117] "RemoveContainer" 
containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:53:13 crc kubenswrapper[4676]: E0930 14:53:13.433637 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:53:25 crc kubenswrapper[4676]: I0930 14:53:25.433844 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:53:25 crc kubenswrapper[4676]: E0930 14:53:25.435031 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:53:36 crc kubenswrapper[4676]: I0930 14:53:36.433505 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:53:36 crc kubenswrapper[4676]: E0930 14:53:36.435166 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:53:51 crc kubenswrapper[4676]: I0930 14:53:51.433696 4676 scope.go:117] 
"RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:53:51 crc kubenswrapper[4676]: E0930 14:53:51.435268 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:54:06 crc kubenswrapper[4676]: I0930 14:54:06.433145 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:54:06 crc kubenswrapper[4676]: E0930 14:54:06.433973 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.496104 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fglpz"] Sep 30 14:54:11 crc kubenswrapper[4676]: E0930 14:54:11.496993 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="registry-server" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.497007 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="registry-server" Sep 30 14:54:11 crc kubenswrapper[4676]: E0930 14:54:11.497043 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="extract-utilities" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.497050 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="extract-utilities" Sep 30 14:54:11 crc kubenswrapper[4676]: E0930 14:54:11.497063 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="extract-content" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.497069 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="extract-content" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.497250 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fbd89d-e247-4297-bccd-7f4e7c6939de" containerName="registry-server" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.498731 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.528096 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fglpz"] Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.627768 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-utilities\") pod \"community-operators-fglpz\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.628090 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-catalog-content\") pod \"community-operators-fglpz\" (UID: 
\"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.628171 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxlhg\" (UniqueName: \"kubernetes.io/projected/645aae53-2a92-47f1-89e0-0ecea4ff98b1-kube-api-access-gxlhg\") pod \"community-operators-fglpz\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.730418 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-utilities\") pod \"community-operators-fglpz\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.730554 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-catalog-content\") pod \"community-operators-fglpz\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.730581 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxlhg\" (UniqueName: \"kubernetes.io/projected/645aae53-2a92-47f1-89e0-0ecea4ff98b1-kube-api-access-gxlhg\") pod \"community-operators-fglpz\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.731014 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-utilities\") pod \"community-operators-fglpz\" (UID: 
\"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.731383 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-catalog-content\") pod \"community-operators-fglpz\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.749016 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxlhg\" (UniqueName: \"kubernetes.io/projected/645aae53-2a92-47f1-89e0-0ecea4ff98b1-kube-api-access-gxlhg\") pod \"community-operators-fglpz\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:11 crc kubenswrapper[4676]: I0930 14:54:11.824948 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:12 crc kubenswrapper[4676]: I0930 14:54:12.318579 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fglpz"] Sep 30 14:54:12 crc kubenswrapper[4676]: I0930 14:54:12.796015 4676 generic.go:334] "Generic (PLEG): container finished" podID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerID="9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe" exitCode=0 Sep 30 14:54:12 crc kubenswrapper[4676]: I0930 14:54:12.796093 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fglpz" event={"ID":"645aae53-2a92-47f1-89e0-0ecea4ff98b1","Type":"ContainerDied","Data":"9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe"} Sep 30 14:54:12 crc kubenswrapper[4676]: I0930 14:54:12.796128 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fglpz" event={"ID":"645aae53-2a92-47f1-89e0-0ecea4ff98b1","Type":"ContainerStarted","Data":"2a1e808f2452cc787e66fc43ea2333e153278f260c0d88c45be13ca39d29598e"} Sep 30 14:54:12 crc kubenswrapper[4676]: I0930 14:54:12.798806 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:54:14 crc kubenswrapper[4676]: I0930 14:54:14.815330 4676 generic.go:334] "Generic (PLEG): container finished" podID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerID="159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2" exitCode=0 Sep 30 14:54:14 crc kubenswrapper[4676]: I0930 14:54:14.815387 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fglpz" event={"ID":"645aae53-2a92-47f1-89e0-0ecea4ff98b1","Type":"ContainerDied","Data":"159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2"} Sep 30 14:54:15 crc kubenswrapper[4676]: I0930 14:54:15.825712 4676 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fglpz" event={"ID":"645aae53-2a92-47f1-89e0-0ecea4ff98b1","Type":"ContainerStarted","Data":"f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541"} Sep 30 14:54:15 crc kubenswrapper[4676]: I0930 14:54:15.851512 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fglpz" podStartSLOduration=2.393545559 podStartE2EDuration="4.851490699s" podCreationTimestamp="2025-09-30 14:54:11 +0000 UTC" firstStartedPulling="2025-09-30 14:54:12.798548403 +0000 UTC m=+3356.781636832" lastFinishedPulling="2025-09-30 14:54:15.256493543 +0000 UTC m=+3359.239581972" observedRunningTime="2025-09-30 14:54:15.843330494 +0000 UTC m=+3359.826418943" watchObservedRunningTime="2025-09-30 14:54:15.851490699 +0000 UTC m=+3359.834579128" Sep 30 14:54:18 crc kubenswrapper[4676]: I0930 14:54:18.433827 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:54:18 crc kubenswrapper[4676]: E0930 14:54:18.434672 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:54:21 crc kubenswrapper[4676]: I0930 14:54:21.825107 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:21 crc kubenswrapper[4676]: I0930 14:54:21.825741 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:21 crc kubenswrapper[4676]: I0930 14:54:21.874293 4676 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:21 crc kubenswrapper[4676]: I0930 14:54:21.921169 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:22 crc kubenswrapper[4676]: I0930 14:54:22.110180 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fglpz"] Sep 30 14:54:23 crc kubenswrapper[4676]: I0930 14:54:23.891387 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fglpz" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="registry-server" containerID="cri-o://f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541" gracePeriod=2 Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.336657 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.471837 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-catalog-content\") pod \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.472219 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxlhg\" (UniqueName: \"kubernetes.io/projected/645aae53-2a92-47f1-89e0-0ecea4ff98b1-kube-api-access-gxlhg\") pod \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.472353 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-utilities\") pod \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\" (UID: \"645aae53-2a92-47f1-89e0-0ecea4ff98b1\") " Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.473301 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-utilities" (OuterVolumeSpecName: "utilities") pod "645aae53-2a92-47f1-89e0-0ecea4ff98b1" (UID: "645aae53-2a92-47f1-89e0-0ecea4ff98b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.479311 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645aae53-2a92-47f1-89e0-0ecea4ff98b1-kube-api-access-gxlhg" (OuterVolumeSpecName: "kube-api-access-gxlhg") pod "645aae53-2a92-47f1-89e0-0ecea4ff98b1" (UID: "645aae53-2a92-47f1-89e0-0ecea4ff98b1"). InnerVolumeSpecName "kube-api-access-gxlhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.520272 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "645aae53-2a92-47f1-89e0-0ecea4ff98b1" (UID: "645aae53-2a92-47f1-89e0-0ecea4ff98b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.580261 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxlhg\" (UniqueName: \"kubernetes.io/projected/645aae53-2a92-47f1-89e0-0ecea4ff98b1-kube-api-access-gxlhg\") on node \"crc\" DevicePath \"\"" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.580331 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.580348 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645aae53-2a92-47f1-89e0-0ecea4ff98b1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.903291 4676 generic.go:334] "Generic (PLEG): container finished" podID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerID="f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541" exitCode=0 Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.903340 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fglpz" event={"ID":"645aae53-2a92-47f1-89e0-0ecea4ff98b1","Type":"ContainerDied","Data":"f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541"} Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.903370 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fglpz" event={"ID":"645aae53-2a92-47f1-89e0-0ecea4ff98b1","Type":"ContainerDied","Data":"2a1e808f2452cc787e66fc43ea2333e153278f260c0d88c45be13ca39d29598e"} Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.903387 4676 scope.go:117] "RemoveContainer" containerID="f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 
14:54:24.903404 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fglpz" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.932065 4676 scope.go:117] "RemoveContainer" containerID="159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2" Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.953276 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fglpz"] Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.962624 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fglpz"] Sep 30 14:54:24 crc kubenswrapper[4676]: I0930 14:54:24.965124 4676 scope.go:117] "RemoveContainer" containerID="9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe" Sep 30 14:54:25 crc kubenswrapper[4676]: I0930 14:54:25.004962 4676 scope.go:117] "RemoveContainer" containerID="f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541" Sep 30 14:54:25 crc kubenswrapper[4676]: E0930 14:54:25.005469 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541\": container with ID starting with f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541 not found: ID does not exist" containerID="f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541" Sep 30 14:54:25 crc kubenswrapper[4676]: I0930 14:54:25.005518 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541"} err="failed to get container status \"f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541\": rpc error: code = NotFound desc = could not find container \"f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541\": container with ID starting with 
f7662c50e3ccb81c065d79576f7a82b20886ff9e1fc01cafb412a318a8273541 not found: ID does not exist" Sep 30 14:54:25 crc kubenswrapper[4676]: I0930 14:54:25.005547 4676 scope.go:117] "RemoveContainer" containerID="159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2" Sep 30 14:54:25 crc kubenswrapper[4676]: E0930 14:54:25.005851 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2\": container with ID starting with 159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2 not found: ID does not exist" containerID="159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2" Sep 30 14:54:25 crc kubenswrapper[4676]: I0930 14:54:25.005884 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2"} err="failed to get container status \"159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2\": rpc error: code = NotFound desc = could not find container \"159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2\": container with ID starting with 159a29ed49966e3aa58e48dafcb9be987bbbaf0275b1cfb19e4c2537873344f2 not found: ID does not exist" Sep 30 14:54:25 crc kubenswrapper[4676]: I0930 14:54:25.005923 4676 scope.go:117] "RemoveContainer" containerID="9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe" Sep 30 14:54:25 crc kubenswrapper[4676]: E0930 14:54:25.006142 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe\": container with ID starting with 9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe not found: ID does not exist" containerID="9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe" Sep 30 14:54:25 crc 
kubenswrapper[4676]: I0930 14:54:25.006202 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe"} err="failed to get container status \"9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe\": rpc error: code = NotFound desc = could not find container \"9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe\": container with ID starting with 9553be432e0117412fd5bc1f6a63e3fd5e9db5b9a352735b0f84a523a9e4ecbe not found: ID does not exist" Sep 30 14:54:25 crc kubenswrapper[4676]: I0930 14:54:25.443769 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" path="/var/lib/kubelet/pods/645aae53-2a92-47f1-89e0-0ecea4ff98b1/volumes" Sep 30 14:54:33 crc kubenswrapper[4676]: I0930 14:54:33.433663 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:54:33 crc kubenswrapper[4676]: E0930 14:54:33.434416 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:54:48 crc kubenswrapper[4676]: I0930 14:54:48.432948 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:54:48 crc kubenswrapper[4676]: E0930 14:54:48.433765 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:54:59 crc kubenswrapper[4676]: I0930 14:54:59.433373 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:54:59 crc kubenswrapper[4676]: E0930 14:54:59.434161 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:55:13 crc kubenswrapper[4676]: I0930 14:55:13.433583 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:55:13 crc kubenswrapper[4676]: E0930 14:55:13.434471 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:55:24 crc kubenswrapper[4676]: I0930 14:55:24.432815 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:55:24 crc kubenswrapper[4676]: E0930 14:55:24.433767 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:55:35 crc kubenswrapper[4676]: I0930 14:55:35.433534 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:55:35 crc kubenswrapper[4676]: E0930 14:55:35.434310 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.042087 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6g7q4"] Sep 30 14:55:43 crc kubenswrapper[4676]: E0930 14:55:43.043578 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="extract-content" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.043597 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="extract-content" Sep 30 14:55:43 crc kubenswrapper[4676]: E0930 14:55:43.043616 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="registry-server" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.043622 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="registry-server" Sep 30 14:55:43 crc kubenswrapper[4676]: E0930 14:55:43.043643 4676 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="extract-utilities" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.043651 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="extract-utilities" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.043862 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="645aae53-2a92-47f1-89e0-0ecea4ff98b1" containerName="registry-server" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.047040 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.054691 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6g7q4"] Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.220867 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-catalog-content\") pod \"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.221586 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p65sr\" (UniqueName: \"kubernetes.io/projected/7fb63044-ad2b-4c42-adef-a0efbaa78a96-kube-api-access-p65sr\") pod \"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.221729 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-utilities\") pod 
\"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.323983 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-utilities\") pod \"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.324092 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-catalog-content\") pod \"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.324193 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p65sr\" (UniqueName: \"kubernetes.io/projected/7fb63044-ad2b-4c42-adef-a0efbaa78a96-kube-api-access-p65sr\") pod \"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.324917 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-utilities\") pod \"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.325156 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-catalog-content\") pod \"certified-operators-6g7q4\" (UID: 
\"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.349741 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p65sr\" (UniqueName: \"kubernetes.io/projected/7fb63044-ad2b-4c42-adef-a0efbaa78a96-kube-api-access-p65sr\") pod \"certified-operators-6g7q4\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.374507 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:43 crc kubenswrapper[4676]: I0930 14:55:43.929737 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6g7q4"] Sep 30 14:55:44 crc kubenswrapper[4676]: I0930 14:55:44.608045 4676 generic.go:334] "Generic (PLEG): container finished" podID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerID="e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe" exitCode=0 Sep 30 14:55:44 crc kubenswrapper[4676]: I0930 14:55:44.608260 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g7q4" event={"ID":"7fb63044-ad2b-4c42-adef-a0efbaa78a96","Type":"ContainerDied","Data":"e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe"} Sep 30 14:55:44 crc kubenswrapper[4676]: I0930 14:55:44.608361 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g7q4" event={"ID":"7fb63044-ad2b-4c42-adef-a0efbaa78a96","Type":"ContainerStarted","Data":"384a3b4d2e0f76822947eef08217604ea609c3cc5b30db286ec95787bd45c31d"} Sep 30 14:55:45 crc kubenswrapper[4676]: I0930 14:55:45.619388 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g7q4" 
event={"ID":"7fb63044-ad2b-4c42-adef-a0efbaa78a96","Type":"ContainerStarted","Data":"a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847"} Sep 30 14:55:46 crc kubenswrapper[4676]: I0930 14:55:46.433162 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:55:46 crc kubenswrapper[4676]: E0930 14:55:46.433957 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:55:46 crc kubenswrapper[4676]: I0930 14:55:46.632029 4676 generic.go:334] "Generic (PLEG): container finished" podID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerID="a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847" exitCode=0 Sep 30 14:55:46 crc kubenswrapper[4676]: I0930 14:55:46.632106 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g7q4" event={"ID":"7fb63044-ad2b-4c42-adef-a0efbaa78a96","Type":"ContainerDied","Data":"a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847"} Sep 30 14:55:46 crc kubenswrapper[4676]: I0930 14:55:46.632205 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g7q4" event={"ID":"7fb63044-ad2b-4c42-adef-a0efbaa78a96","Type":"ContainerStarted","Data":"367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866"} Sep 30 14:55:46 crc kubenswrapper[4676]: I0930 14:55:46.661536 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6g7q4" podStartSLOduration=2.11932622 podStartE2EDuration="3.661500558s" 
podCreationTimestamp="2025-09-30 14:55:43 +0000 UTC" firstStartedPulling="2025-09-30 14:55:44.610314472 +0000 UTC m=+3448.593402901" lastFinishedPulling="2025-09-30 14:55:46.15248881 +0000 UTC m=+3450.135577239" observedRunningTime="2025-09-30 14:55:46.659081135 +0000 UTC m=+3450.642169564" watchObservedRunningTime="2025-09-30 14:55:46.661500558 +0000 UTC m=+3450.644588987" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.241250 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j2tt2"] Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.244316 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.256848 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2tt2"] Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.306169 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4qf\" (UniqueName: \"kubernetes.io/projected/ad433a9b-2c06-42b5-89f5-0f527112115b-kube-api-access-pn4qf\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.306471 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-catalog-content\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.306555 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-utilities\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.408003 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn4qf\" (UniqueName: \"kubernetes.io/projected/ad433a9b-2c06-42b5-89f5-0f527112115b-kube-api-access-pn4qf\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.408194 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-catalog-content\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.408235 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-utilities\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.408823 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-utilities\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.409027 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-catalog-content\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.436835 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn4qf\" (UniqueName: \"kubernetes.io/projected/ad433a9b-2c06-42b5-89f5-0f527112115b-kube-api-access-pn4qf\") pod \"redhat-marketplace-j2tt2\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:47 crc kubenswrapper[4676]: I0930 14:55:47.566992 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:48 crc kubenswrapper[4676]: I0930 14:55:48.118897 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2tt2"] Sep 30 14:55:48 crc kubenswrapper[4676]: W0930 14:55:48.132133 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad433a9b_2c06_42b5_89f5_0f527112115b.slice/crio-ecc74083549c1c1219005d6c5dd4a72594514da9d6d8193bad6a67523fc55321 WatchSource:0}: Error finding container ecc74083549c1c1219005d6c5dd4a72594514da9d6d8193bad6a67523fc55321: Status 404 returned error can't find the container with id ecc74083549c1c1219005d6c5dd4a72594514da9d6d8193bad6a67523fc55321 Sep 30 14:55:48 crc kubenswrapper[4676]: I0930 14:55:48.655289 4676 generic.go:334] "Generic (PLEG): container finished" podID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerID="b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9" exitCode=0 Sep 30 14:55:48 crc kubenswrapper[4676]: I0930 14:55:48.655408 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2tt2" 
event={"ID":"ad433a9b-2c06-42b5-89f5-0f527112115b","Type":"ContainerDied","Data":"b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9"} Sep 30 14:55:48 crc kubenswrapper[4676]: I0930 14:55:48.655627 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2tt2" event={"ID":"ad433a9b-2c06-42b5-89f5-0f527112115b","Type":"ContainerStarted","Data":"ecc74083549c1c1219005d6c5dd4a72594514da9d6d8193bad6a67523fc55321"} Sep 30 14:55:49 crc kubenswrapper[4676]: I0930 14:55:49.667400 4676 generic.go:334] "Generic (PLEG): container finished" podID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerID="9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318" exitCode=0 Sep 30 14:55:49 crc kubenswrapper[4676]: I0930 14:55:49.667519 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2tt2" event={"ID":"ad433a9b-2c06-42b5-89f5-0f527112115b","Type":"ContainerDied","Data":"9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318"} Sep 30 14:55:50 crc kubenswrapper[4676]: I0930 14:55:50.679496 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2tt2" event={"ID":"ad433a9b-2c06-42b5-89f5-0f527112115b","Type":"ContainerStarted","Data":"a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e"} Sep 30 14:55:50 crc kubenswrapper[4676]: I0930 14:55:50.706793 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j2tt2" podStartSLOduration=2.31841105 podStartE2EDuration="3.706757839s" podCreationTimestamp="2025-09-30 14:55:47 +0000 UTC" firstStartedPulling="2025-09-30 14:55:48.657068454 +0000 UTC m=+3452.640156893" lastFinishedPulling="2025-09-30 14:55:50.045415253 +0000 UTC m=+3454.028503682" observedRunningTime="2025-09-30 14:55:50.700519665 +0000 UTC m=+3454.683608094" watchObservedRunningTime="2025-09-30 14:55:50.706757839 +0000 UTC 
m=+3454.689846268" Sep 30 14:55:53 crc kubenswrapper[4676]: I0930 14:55:53.374917 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:53 crc kubenswrapper[4676]: I0930 14:55:53.375508 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:53 crc kubenswrapper[4676]: I0930 14:55:53.422296 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:53 crc kubenswrapper[4676]: I0930 14:55:53.753977 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.031962 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6g7q4"] Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.032558 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6g7q4" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="registry-server" containerID="cri-o://367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866" gracePeriod=2 Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.554212 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.712010 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-utilities\") pod \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.712396 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p65sr\" (UniqueName: \"kubernetes.io/projected/7fb63044-ad2b-4c42-adef-a0efbaa78a96-kube-api-access-p65sr\") pod \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.712501 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-catalog-content\") pod \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\" (UID: \"7fb63044-ad2b-4c42-adef-a0efbaa78a96\") " Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.713136 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-utilities" (OuterVolumeSpecName: "utilities") pod "7fb63044-ad2b-4c42-adef-a0efbaa78a96" (UID: "7fb63044-ad2b-4c42-adef-a0efbaa78a96"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.716403 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.735849 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb63044-ad2b-4c42-adef-a0efbaa78a96-kube-api-access-p65sr" (OuterVolumeSpecName: "kube-api-access-p65sr") pod "7fb63044-ad2b-4c42-adef-a0efbaa78a96" (UID: "7fb63044-ad2b-4c42-adef-a0efbaa78a96"). InnerVolumeSpecName "kube-api-access-p65sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.755606 4676 generic.go:334] "Generic (PLEG): container finished" podID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerID="367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866" exitCode=0 Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.755681 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g7q4" event={"ID":"7fb63044-ad2b-4c42-adef-a0efbaa78a96","Type":"ContainerDied","Data":"367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866"} Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.755722 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g7q4" event={"ID":"7fb63044-ad2b-4c42-adef-a0efbaa78a96","Type":"ContainerDied","Data":"384a3b4d2e0f76822947eef08217604ea609c3cc5b30db286ec95787bd45c31d"} Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.755785 4676 scope.go:117] "RemoveContainer" containerID="367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.756110 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6g7q4" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.783699 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fb63044-ad2b-4c42-adef-a0efbaa78a96" (UID: "7fb63044-ad2b-4c42-adef-a0efbaa78a96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.814190 4676 scope.go:117] "RemoveContainer" containerID="a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.820103 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p65sr\" (UniqueName: \"kubernetes.io/projected/7fb63044-ad2b-4c42-adef-a0efbaa78a96-kube-api-access-p65sr\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.820142 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fb63044-ad2b-4c42-adef-a0efbaa78a96-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.868229 4676 scope.go:117] "RemoveContainer" containerID="e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.932250 4676 scope.go:117] "RemoveContainer" containerID="367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866" Sep 30 14:55:56 crc kubenswrapper[4676]: E0930 14:55:56.934082 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866\": container with ID starting with 367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866 not found: ID does not exist" 
containerID="367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.934118 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866"} err="failed to get container status \"367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866\": rpc error: code = NotFound desc = could not find container \"367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866\": container with ID starting with 367b879364f57c65b68561208b09f543a504c8d37e5659a22ca5efc9da7b0866 not found: ID does not exist" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.934146 4676 scope.go:117] "RemoveContainer" containerID="a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847" Sep 30 14:55:56 crc kubenswrapper[4676]: E0930 14:55:56.941173 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847\": container with ID starting with a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847 not found: ID does not exist" containerID="a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.941417 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847"} err="failed to get container status \"a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847\": rpc error: code = NotFound desc = could not find container \"a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847\": container with ID starting with a1f019eab09f0cce58868b1940e0ea2a4aedec65790c75de057bfb0e6eb18847 not found: ID does not exist" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.941548 4676 scope.go:117] 
"RemoveContainer" containerID="e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe" Sep 30 14:55:56 crc kubenswrapper[4676]: E0930 14:55:56.943104 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe\": container with ID starting with e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe not found: ID does not exist" containerID="e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe" Sep 30 14:55:56 crc kubenswrapper[4676]: I0930 14:55:56.943210 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe"} err="failed to get container status \"e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe\": rpc error: code = NotFound desc = could not find container \"e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe\": container with ID starting with e07b66327b071f6e350cbd6842a36240e48247d13605420480832560d0d7a8fe not found: ID does not exist" Sep 30 14:55:57 crc kubenswrapper[4676]: I0930 14:55:57.089896 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6g7q4"] Sep 30 14:55:57 crc kubenswrapper[4676]: I0930 14:55:57.100360 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6g7q4"] Sep 30 14:55:57 crc kubenswrapper[4676]: I0930 14:55:57.446730 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" path="/var/lib/kubelet/pods/7fb63044-ad2b-4c42-adef-a0efbaa78a96/volumes" Sep 30 14:55:57 crc kubenswrapper[4676]: I0930 14:55:57.567142 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:57 crc kubenswrapper[4676]: I0930 
14:55:57.567212 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:57 crc kubenswrapper[4676]: I0930 14:55:57.621165 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:55:57 crc kubenswrapper[4676]: I0930 14:55:57.806126 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.026377 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2tt2"] Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.026908 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j2tt2" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="registry-server" containerID="cri-o://a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e" gracePeriod=2 Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.521021 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.697125 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn4qf\" (UniqueName: \"kubernetes.io/projected/ad433a9b-2c06-42b5-89f5-0f527112115b-kube-api-access-pn4qf\") pod \"ad433a9b-2c06-42b5-89f5-0f527112115b\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.697193 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-catalog-content\") pod \"ad433a9b-2c06-42b5-89f5-0f527112115b\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.697231 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-utilities\") pod \"ad433a9b-2c06-42b5-89f5-0f527112115b\" (UID: \"ad433a9b-2c06-42b5-89f5-0f527112115b\") " Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.698313 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-utilities" (OuterVolumeSpecName: "utilities") pod "ad433a9b-2c06-42b5-89f5-0f527112115b" (UID: "ad433a9b-2c06-42b5-89f5-0f527112115b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.702815 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad433a9b-2c06-42b5-89f5-0f527112115b-kube-api-access-pn4qf" (OuterVolumeSpecName: "kube-api-access-pn4qf") pod "ad433a9b-2c06-42b5-89f5-0f527112115b" (UID: "ad433a9b-2c06-42b5-89f5-0f527112115b"). InnerVolumeSpecName "kube-api-access-pn4qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.710327 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad433a9b-2c06-42b5-89f5-0f527112115b" (UID: "ad433a9b-2c06-42b5-89f5-0f527112115b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.792008 4676 generic.go:334] "Generic (PLEG): container finished" podID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerID="a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e" exitCode=0 Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.792051 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2tt2" event={"ID":"ad433a9b-2c06-42b5-89f5-0f527112115b","Type":"ContainerDied","Data":"a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e"} Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.792065 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2tt2" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.792075 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2tt2" event={"ID":"ad433a9b-2c06-42b5-89f5-0f527112115b","Type":"ContainerDied","Data":"ecc74083549c1c1219005d6c5dd4a72594514da9d6d8193bad6a67523fc55321"} Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.792094 4676 scope.go:117] "RemoveContainer" containerID="a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.799120 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn4qf\" (UniqueName: \"kubernetes.io/projected/ad433a9b-2c06-42b5-89f5-0f527112115b-kube-api-access-pn4qf\") on node \"crc\" DevicePath \"\"" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.799194 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.799209 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad433a9b-2c06-42b5-89f5-0f527112115b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.813325 4676 scope.go:117] "RemoveContainer" containerID="9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.845478 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2tt2"] Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.852395 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2tt2"] Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.856616 4676 scope.go:117] 
"RemoveContainer" containerID="b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.894666 4676 scope.go:117] "RemoveContainer" containerID="a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e" Sep 30 14:56:00 crc kubenswrapper[4676]: E0930 14:56:00.895336 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e\": container with ID starting with a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e not found: ID does not exist" containerID="a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.895383 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e"} err="failed to get container status \"a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e\": rpc error: code = NotFound desc = could not find container \"a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e\": container with ID starting with a89d8487ebba5829bb4b420ee05f76166c4bc97bdca4dc026fd8ac0cb26f3b9e not found: ID does not exist" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.895411 4676 scope.go:117] "RemoveContainer" containerID="9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318" Sep 30 14:56:00 crc kubenswrapper[4676]: E0930 14:56:00.895837 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318\": container with ID starting with 9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318 not found: ID does not exist" containerID="9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318" Sep 30 14:56:00 crc 
kubenswrapper[4676]: I0930 14:56:00.895920 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318"} err="failed to get container status \"9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318\": rpc error: code = NotFound desc = could not find container \"9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318\": container with ID starting with 9f557af70430057334b34612f0c96a8bbbb928b85a9f7fb4b563e8b9ef4c4318 not found: ID does not exist" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.895951 4676 scope.go:117] "RemoveContainer" containerID="b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9" Sep 30 14:56:00 crc kubenswrapper[4676]: E0930 14:56:00.896288 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9\": container with ID starting with b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9 not found: ID does not exist" containerID="b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9" Sep 30 14:56:00 crc kubenswrapper[4676]: I0930 14:56:00.896335 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9"} err="failed to get container status \"b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9\": rpc error: code = NotFound desc = could not find container \"b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9\": container with ID starting with b4c202910ed7b9b0f5850d48960d2ebf54b1d62a2fb6a2cda4f2e112f3b3d6e9 not found: ID does not exist" Sep 30 14:56:01 crc kubenswrapper[4676]: I0930 14:56:01.433588 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 
14:56:01 crc kubenswrapper[4676]: E0930 14:56:01.433926 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:56:01 crc kubenswrapper[4676]: I0930 14:56:01.445883 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" path="/var/lib/kubelet/pods/ad433a9b-2c06-42b5-89f5-0f527112115b/volumes" Sep 30 14:56:15 crc kubenswrapper[4676]: I0930 14:56:15.434384 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:56:15 crc kubenswrapper[4676]: E0930 14:56:15.435519 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:56:26 crc kubenswrapper[4676]: I0930 14:56:26.433805 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:56:26 crc kubenswrapper[4676]: E0930 14:56:26.435016 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 14:56:40 crc kubenswrapper[4676]: I0930 14:56:40.433065 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 14:56:41 crc kubenswrapper[4676]: I0930 14:56:41.158937 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"30d1d9e76f6f2a6cf5fbf2ec7044efa19d21f8cf0a1419c92944482b8ca02b9b"} Sep 30 14:58:59 crc kubenswrapper[4676]: I0930 14:58:59.919926 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:58:59 crc kubenswrapper[4676]: I0930 14:58:59.920510 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:59:08 crc kubenswrapper[4676]: I0930 14:59:08.494801 4676 generic.go:334] "Generic (PLEG): container finished" podID="10b046c1-241e-4dfd-9aa3-d3e5532a6190" containerID="bfe576d41d89b88bedf6a386cea7c8e7bdfcbe27bf4195eb56c4e647af73d0f0" exitCode=0 Sep 30 14:59:08 crc kubenswrapper[4676]: I0930 14:59:08.494913 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10b046c1-241e-4dfd-9aa3-d3e5532a6190","Type":"ContainerDied","Data":"bfe576d41d89b88bedf6a386cea7c8e7bdfcbe27bf4195eb56c4e647af73d0f0"} Sep 30 14:59:09 crc kubenswrapper[4676]: I0930 
14:59:09.877669 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.077322 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-temporary\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.077718 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.077798 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-config-data\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.077944 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.077964 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw9nq\" (UniqueName: \"kubernetes.io/projected/10b046c1-241e-4dfd-9aa3-d3e5532a6190-kube-api-access-vw9nq\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.077986 4676 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config-secret\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.078044 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ca-certs\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.078061 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ssh-key\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.078120 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-workdir\") pod \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\" (UID: \"10b046c1-241e-4dfd-9aa3-d3e5532a6190\") " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.078172 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.078735 4676 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.078741 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-config-data" (OuterVolumeSpecName: "config-data") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.083978 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b046c1-241e-4dfd-9aa3-d3e5532a6190-kube-api-access-vw9nq" (OuterVolumeSpecName: "kube-api-access-vw9nq") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "kube-api-access-vw9nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.085818 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.086179 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.109145 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.109773 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.112740 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.137611 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "10b046c1-241e-4dfd-9aa3-d3e5532a6190" (UID: "10b046c1-241e-4dfd-9aa3-d3e5532a6190"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180466 4676 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/10b046c1-241e-4dfd-9aa3-d3e5532a6190-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180509 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180523 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10b046c1-241e-4dfd-9aa3-d3e5532a6190-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180558 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180570 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw9nq\" (UniqueName: \"kubernetes.io/projected/10b046c1-241e-4dfd-9aa3-d3e5532a6190-kube-api-access-vw9nq\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180583 4676 reconciler_common.go:293] "Volume detached for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180594 4676 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ca-certs\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.180604 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10b046c1-241e-4dfd-9aa3-d3e5532a6190-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.200584 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.282227 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.515090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"10b046c1-241e-4dfd-9aa3-d3e5532a6190","Type":"ContainerDied","Data":"2e6525e6220f788aead0c6e8543e2695df88158cf32b944fa2e94abf1b4369bc"} Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.515142 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e6525e6220f788aead0c6e8543e2695df88158cf32b944fa2e94abf1b4369bc" Sep 30 14:59:10 crc kubenswrapper[4676]: I0930 14:59:10.515177 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.974294 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 14:59:13 crc kubenswrapper[4676]: E0930 14:59:13.976035 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="extract-utilities" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976063 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="extract-utilities" Sep 30 14:59:13 crc kubenswrapper[4676]: E0930 14:59:13.976103 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b046c1-241e-4dfd-9aa3-d3e5532a6190" containerName="tempest-tests-tempest-tests-runner" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976114 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b046c1-241e-4dfd-9aa3-d3e5532a6190" containerName="tempest-tests-tempest-tests-runner" Sep 30 14:59:13 crc kubenswrapper[4676]: E0930 14:59:13.976150 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="extract-content" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976160 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="extract-content" Sep 30 14:59:13 crc kubenswrapper[4676]: E0930 14:59:13.976229 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="registry-server" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976240 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="registry-server" Sep 30 14:59:13 crc kubenswrapper[4676]: E0930 14:59:13.976255 4676 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="extract-content" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976264 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="extract-content" Sep 30 14:59:13 crc kubenswrapper[4676]: E0930 14:59:13.976279 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="extract-utilities" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976288 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="extract-utilities" Sep 30 14:59:13 crc kubenswrapper[4676]: E0930 14:59:13.976297 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="registry-server" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976305 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="registry-server" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976568 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb63044-ad2b-4c42-adef-a0efbaa78a96" containerName="registry-server" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976585 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad433a9b-2c06-42b5-89f5-0f527112115b" containerName="registry-server" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.976598 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b046c1-241e-4dfd-9aa3-d3e5532a6190" containerName="tempest-tests-tempest-tests-runner" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.977559 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.980724 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qlpht" Sep 30 14:59:13 crc kubenswrapper[4676]: I0930 14:59:13.986954 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.157174 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bj26\" (UniqueName: \"kubernetes.io/projected/0def7648-4733-47fb-bd01-0dba601ea3cc-kube-api-access-4bj26\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0def7648-4733-47fb-bd01-0dba601ea3cc\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.157260 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0def7648-4733-47fb-bd01-0dba601ea3cc\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.259838 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bj26\" (UniqueName: \"kubernetes.io/projected/0def7648-4733-47fb-bd01-0dba601ea3cc-kube-api-access-4bj26\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0def7648-4733-47fb-bd01-0dba601ea3cc\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.259957 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0def7648-4733-47fb-bd01-0dba601ea3cc\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.260495 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0def7648-4733-47fb-bd01-0dba601ea3cc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.281070 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bj26\" (UniqueName: \"kubernetes.io/projected/0def7648-4733-47fb-bd01-0dba601ea3cc-kube-api-access-4bj26\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0def7648-4733-47fb-bd01-0dba601ea3cc\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.310435 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0def7648-4733-47fb-bd01-0dba601ea3cc\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:14 crc kubenswrapper[4676]: I0930 14:59:14.607675 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Sep 30 14:59:15 crc kubenswrapper[4676]: I0930 14:59:15.076226 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Sep 30 14:59:15 crc kubenswrapper[4676]: I0930 14:59:15.077692 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 14:59:15 crc kubenswrapper[4676]: I0930 14:59:15.563093 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0def7648-4733-47fb-bd01-0dba601ea3cc","Type":"ContainerStarted","Data":"a31c55736324c717bcc14552d1586cdf1bbf355f18e6b602b18395716a4e279c"} Sep 30 14:59:16 crc kubenswrapper[4676]: I0930 14:59:16.575612 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0def7648-4733-47fb-bd01-0dba601ea3cc","Type":"ContainerStarted","Data":"587a9e549de2be84b9b09eb93cbc702d2369b0f107c42dcae96097dd22c52d18"} Sep 30 14:59:16 crc kubenswrapper[4676]: I0930 14:59:16.596968 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.689955627 podStartE2EDuration="3.596942253s" podCreationTimestamp="2025-09-30 14:59:13 +0000 UTC" firstStartedPulling="2025-09-30 14:59:15.077494811 +0000 UTC m=+3659.060583230" lastFinishedPulling="2025-09-30 14:59:15.984481427 +0000 UTC m=+3659.967569856" observedRunningTime="2025-09-30 14:59:16.589260741 +0000 UTC m=+3660.572349180" watchObservedRunningTime="2025-09-30 14:59:16.596942253 +0000 UTC m=+3660.580030692" Sep 30 14:59:29 crc kubenswrapper[4676]: I0930 14:59:29.919577 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:59:29 crc kubenswrapper[4676]: I0930 14:59:29.920283 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.797490 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q7tf5/must-gather-72l9f"] Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.800925 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.803919 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-q7tf5"/"openshift-service-ca.crt" Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.804985 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-q7tf5"/"kube-root-ca.crt" Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.805978 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-q7tf5"/"default-dockercfg-whv2f" Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.809153 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q7tf5/must-gather-72l9f"] Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.942276 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvkb\" (UniqueName: \"kubernetes.io/projected/26bc2079-4750-497e-a94b-c77e49611498-kube-api-access-7jvkb\") pod \"must-gather-72l9f\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " 
pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:33 crc kubenswrapper[4676]: I0930 14:59:33.943725 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26bc2079-4750-497e-a94b-c77e49611498-must-gather-output\") pod \"must-gather-72l9f\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:34 crc kubenswrapper[4676]: I0930 14:59:34.045284 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvkb\" (UniqueName: \"kubernetes.io/projected/26bc2079-4750-497e-a94b-c77e49611498-kube-api-access-7jvkb\") pod \"must-gather-72l9f\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:34 crc kubenswrapper[4676]: I0930 14:59:34.045341 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26bc2079-4750-497e-a94b-c77e49611498-must-gather-output\") pod \"must-gather-72l9f\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:34 crc kubenswrapper[4676]: I0930 14:59:34.045766 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26bc2079-4750-497e-a94b-c77e49611498-must-gather-output\") pod \"must-gather-72l9f\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:34 crc kubenswrapper[4676]: I0930 14:59:34.065120 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvkb\" (UniqueName: \"kubernetes.io/projected/26bc2079-4750-497e-a94b-c77e49611498-kube-api-access-7jvkb\") pod \"must-gather-72l9f\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " 
pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:34 crc kubenswrapper[4676]: I0930 14:59:34.129252 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 14:59:34 crc kubenswrapper[4676]: I0930 14:59:34.596197 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q7tf5/must-gather-72l9f"] Sep 30 14:59:34 crc kubenswrapper[4676]: W0930 14:59:34.598127 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26bc2079_4750_497e_a94b_c77e49611498.slice/crio-ee30f3c602be75a87e0ce5d62e5bf573d339c76cdbe48a523901c5360d640f32 WatchSource:0}: Error finding container ee30f3c602be75a87e0ce5d62e5bf573d339c76cdbe48a523901c5360d640f32: Status 404 returned error can't find the container with id ee30f3c602be75a87e0ce5d62e5bf573d339c76cdbe48a523901c5360d640f32 Sep 30 14:59:34 crc kubenswrapper[4676]: I0930 14:59:34.736465 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/must-gather-72l9f" event={"ID":"26bc2079-4750-497e-a94b-c77e49611498","Type":"ContainerStarted","Data":"ee30f3c602be75a87e0ce5d62e5bf573d339c76cdbe48a523901c5360d640f32"} Sep 30 14:59:38 crc kubenswrapper[4676]: I0930 14:59:38.787108 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/must-gather-72l9f" event={"ID":"26bc2079-4750-497e-a94b-c77e49611498","Type":"ContainerStarted","Data":"d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38"} Sep 30 14:59:38 crc kubenswrapper[4676]: I0930 14:59:38.787693 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/must-gather-72l9f" event={"ID":"26bc2079-4750-497e-a94b-c77e49611498","Type":"ContainerStarted","Data":"cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7"} Sep 30 14:59:38 crc kubenswrapper[4676]: I0930 14:59:38.809229 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q7tf5/must-gather-72l9f" podStartSLOduration=2.160090888 podStartE2EDuration="5.809206301s" podCreationTimestamp="2025-09-30 14:59:33 +0000 UTC" firstStartedPulling="2025-09-30 14:59:34.600975223 +0000 UTC m=+3678.584063652" lastFinishedPulling="2025-09-30 14:59:38.250090636 +0000 UTC m=+3682.233179065" observedRunningTime="2025-09-30 14:59:38.799522937 +0000 UTC m=+3682.782611376" watchObservedRunningTime="2025-09-30 14:59:38.809206301 +0000 UTC m=+3682.792294730" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.364013 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-fmh66"] Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.367022 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.524854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlwdb\" (UniqueName: \"kubernetes.io/projected/62658c01-335b-4901-8fa2-fe4bca4348d3-kube-api-access-vlwdb\") pod \"crc-debug-fmh66\" (UID: \"62658c01-335b-4901-8fa2-fe4bca4348d3\") " pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.525027 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62658c01-335b-4901-8fa2-fe4bca4348d3-host\") pod \"crc-debug-fmh66\" (UID: \"62658c01-335b-4901-8fa2-fe4bca4348d3\") " pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.627483 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62658c01-335b-4901-8fa2-fe4bca4348d3-host\") pod \"crc-debug-fmh66\" (UID: 
\"62658c01-335b-4901-8fa2-fe4bca4348d3\") " pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.627674 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlwdb\" (UniqueName: \"kubernetes.io/projected/62658c01-335b-4901-8fa2-fe4bca4348d3-kube-api-access-vlwdb\") pod \"crc-debug-fmh66\" (UID: \"62658c01-335b-4901-8fa2-fe4bca4348d3\") " pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.627733 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62658c01-335b-4901-8fa2-fe4bca4348d3-host\") pod \"crc-debug-fmh66\" (UID: \"62658c01-335b-4901-8fa2-fe4bca4348d3\") " pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.648358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlwdb\" (UniqueName: \"kubernetes.io/projected/62658c01-335b-4901-8fa2-fe4bca4348d3-kube-api-access-vlwdb\") pod \"crc-debug-fmh66\" (UID: \"62658c01-335b-4901-8fa2-fe4bca4348d3\") " pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.694405 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 14:59:42 crc kubenswrapper[4676]: I0930 14:59:42.839103 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" event={"ID":"62658c01-335b-4901-8fa2-fe4bca4348d3","Type":"ContainerStarted","Data":"d9f4dbf90e982ea36688e6be41555a3032a0bad40d8e3328ac6a1ddb42bb3965"} Sep 30 14:59:55 crc kubenswrapper[4676]: I0930 14:59:55.989983 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" event={"ID":"62658c01-335b-4901-8fa2-fe4bca4348d3","Type":"ContainerStarted","Data":"9bcdc1a8d44a9a07a94c6f07e7996f5427a28a3704e17cea61c4ce265a85651d"} Sep 30 14:59:56 crc kubenswrapper[4676]: I0930 14:59:56.011972 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" podStartSLOduration=1.523569305 podStartE2EDuration="14.011950072s" podCreationTimestamp="2025-09-30 14:59:42 +0000 UTC" firstStartedPulling="2025-09-30 14:59:42.742171283 +0000 UTC m=+3686.725259712" lastFinishedPulling="2025-09-30 14:59:55.23055205 +0000 UTC m=+3699.213640479" observedRunningTime="2025-09-30 14:59:56.006597527 +0000 UTC m=+3699.989685966" watchObservedRunningTime="2025-09-30 14:59:56.011950072 +0000 UTC m=+3699.995038501" Sep 30 14:59:59 crc kubenswrapper[4676]: I0930 14:59:59.919716 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 14:59:59 crc kubenswrapper[4676]: I0930 14:59:59.920949 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 14:59:59 crc kubenswrapper[4676]: I0930 14:59:59.921056 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 14:59:59 crc kubenswrapper[4676]: I0930 14:59:59.922753 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30d1d9e76f6f2a6cf5fbf2ec7044efa19d21f8cf0a1419c92944482b8ca02b9b"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 14:59:59 crc kubenswrapper[4676]: I0930 14:59:59.922844 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://30d1d9e76f6f2a6cf5fbf2ec7044efa19d21f8cf0a1419c92944482b8ca02b9b" gracePeriod=600 Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.192802 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2"] Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.195067 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.207408 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.207559 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.209070 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2"] Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.325803 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrs8\" (UniqueName: \"kubernetes.io/projected/d072b0f8-65f0-49b4-b184-8bc190f9090c-kube-api-access-zqrs8\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.325964 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d072b0f8-65f0-49b4-b184-8bc190f9090c-secret-volume\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.326032 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d072b0f8-65f0-49b4-b184-8bc190f9090c-config-volume\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.428444 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d072b0f8-65f0-49b4-b184-8bc190f9090c-secret-volume\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.428620 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d072b0f8-65f0-49b4-b184-8bc190f9090c-config-volume\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.428678 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqrs8\" (UniqueName: \"kubernetes.io/projected/d072b0f8-65f0-49b4-b184-8bc190f9090c-kube-api-access-zqrs8\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.429655 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d072b0f8-65f0-49b4-b184-8bc190f9090c-config-volume\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.447377 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d072b0f8-65f0-49b4-b184-8bc190f9090c-secret-volume\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.452440 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrs8\" (UniqueName: \"kubernetes.io/projected/d072b0f8-65f0-49b4-b184-8bc190f9090c-kube-api-access-zqrs8\") pod \"collect-profiles-29320740-m9nb2\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:00 crc kubenswrapper[4676]: I0930 15:00:00.535923 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:01 crc kubenswrapper[4676]: I0930 15:00:01.015983 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2"] Sep 30 15:00:01 crc kubenswrapper[4676]: I0930 15:00:01.045683 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="30d1d9e76f6f2a6cf5fbf2ec7044efa19d21f8cf0a1419c92944482b8ca02b9b" exitCode=0 Sep 30 15:00:01 crc kubenswrapper[4676]: I0930 15:00:01.045736 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"30d1d9e76f6f2a6cf5fbf2ec7044efa19d21f8cf0a1419c92944482b8ca02b9b"} Sep 30 15:00:01 crc kubenswrapper[4676]: I0930 15:00:01.045762 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" 
event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a"} Sep 30 15:00:01 crc kubenswrapper[4676]: I0930 15:00:01.045780 4676 scope.go:117] "RemoveContainer" containerID="784fe086d452c9a174375c718c7a65e162c4962c87a59123843848bb343aa424" Sep 30 15:00:03 crc kubenswrapper[4676]: W0930 15:00:03.315585 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd072b0f8_65f0_49b4_b184_8bc190f9090c.slice/crio-f40c6c2b4dc58cdf36c9599e8e9da452cf5f95e8b29d67d87818e4375e6ea47a WatchSource:0}: Error finding container f40c6c2b4dc58cdf36c9599e8e9da452cf5f95e8b29d67d87818e4375e6ea47a: Status 404 returned error can't find the container with id f40c6c2b4dc58cdf36c9599e8e9da452cf5f95e8b29d67d87818e4375e6ea47a Sep 30 15:00:04 crc kubenswrapper[4676]: I0930 15:00:04.076166 4676 generic.go:334] "Generic (PLEG): container finished" podID="d072b0f8-65f0-49b4-b184-8bc190f9090c" containerID="ee4b42b1d7dacd60c7efb4c0b677922e2d6d0909e921b5d13dc691ddfef343cf" exitCode=0 Sep 30 15:00:04 crc kubenswrapper[4676]: I0930 15:00:04.076243 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" event={"ID":"d072b0f8-65f0-49b4-b184-8bc190f9090c","Type":"ContainerDied","Data":"ee4b42b1d7dacd60c7efb4c0b677922e2d6d0909e921b5d13dc691ddfef343cf"} Sep 30 15:00:04 crc kubenswrapper[4676]: I0930 15:00:04.076784 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" event={"ID":"d072b0f8-65f0-49b4-b184-8bc190f9090c","Type":"ContainerStarted","Data":"f40c6c2b4dc58cdf36c9599e8e9da452cf5f95e8b29d67d87818e4375e6ea47a"} Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.509380 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.648526 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d072b0f8-65f0-49b4-b184-8bc190f9090c-config-volume\") pod \"d072b0f8-65f0-49b4-b184-8bc190f9090c\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.648784 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqrs8\" (UniqueName: \"kubernetes.io/projected/d072b0f8-65f0-49b4-b184-8bc190f9090c-kube-api-access-zqrs8\") pod \"d072b0f8-65f0-49b4-b184-8bc190f9090c\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.648990 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d072b0f8-65f0-49b4-b184-8bc190f9090c-secret-volume\") pod \"d072b0f8-65f0-49b4-b184-8bc190f9090c\" (UID: \"d072b0f8-65f0-49b4-b184-8bc190f9090c\") " Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.649462 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d072b0f8-65f0-49b4-b184-8bc190f9090c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d072b0f8-65f0-49b4-b184-8bc190f9090c" (UID: "d072b0f8-65f0-49b4-b184-8bc190f9090c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.649855 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d072b0f8-65f0-49b4-b184-8bc190f9090c-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.658102 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d072b0f8-65f0-49b4-b184-8bc190f9090c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d072b0f8-65f0-49b4-b184-8bc190f9090c" (UID: "d072b0f8-65f0-49b4-b184-8bc190f9090c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.658600 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d072b0f8-65f0-49b4-b184-8bc190f9090c-kube-api-access-zqrs8" (OuterVolumeSpecName: "kube-api-access-zqrs8") pod "d072b0f8-65f0-49b4-b184-8bc190f9090c" (UID: "d072b0f8-65f0-49b4-b184-8bc190f9090c"). InnerVolumeSpecName "kube-api-access-zqrs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.751559 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqrs8\" (UniqueName: \"kubernetes.io/projected/d072b0f8-65f0-49b4-b184-8bc190f9090c-kube-api-access-zqrs8\") on node \"crc\" DevicePath \"\"" Sep 30 15:00:05 crc kubenswrapper[4676]: I0930 15:00:05.751599 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d072b0f8-65f0-49b4-b184-8bc190f9090c-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 15:00:06 crc kubenswrapper[4676]: I0930 15:00:06.097284 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" event={"ID":"d072b0f8-65f0-49b4-b184-8bc190f9090c","Type":"ContainerDied","Data":"f40c6c2b4dc58cdf36c9599e8e9da452cf5f95e8b29d67d87818e4375e6ea47a"} Sep 30 15:00:06 crc kubenswrapper[4676]: I0930 15:00:06.097333 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f40c6c2b4dc58cdf36c9599e8e9da452cf5f95e8b29d67d87818e4375e6ea47a" Sep 30 15:00:06 crc kubenswrapper[4676]: I0930 15:00:06.097388 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320740-m9nb2" Sep 30 15:00:06 crc kubenswrapper[4676]: I0930 15:00:06.585381 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f"] Sep 30 15:00:06 crc kubenswrapper[4676]: I0930 15:00:06.598177 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320695-ztt5f"] Sep 30 15:00:07 crc kubenswrapper[4676]: I0930 15:00:07.447068 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ee7394-b396-4e50-9ead-e46010aa9f40" path="/var/lib/kubelet/pods/59ee7394-b396-4e50-9ead-e46010aa9f40/volumes" Sep 30 15:00:23 crc kubenswrapper[4676]: I0930 15:00:23.136022 4676 scope.go:117] "RemoveContainer" containerID="4194a635ef36d9d1ddf532db3fde4407a3a65b5bcfa3c4ad7b5e52174214554b" Sep 30 15:00:45 crc kubenswrapper[4676]: I0930 15:00:45.857370 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbdbfbbdb-mjhjg_1e134bd5-ad40-427d-ba65-7cf9a5a25104/barbican-api/0.log" Sep 30 15:00:46 crc kubenswrapper[4676]: I0930 15:00:46.078581 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbdbfbbdb-mjhjg_1e134bd5-ad40-427d-ba65-7cf9a5a25104/barbican-api-log/0.log" Sep 30 15:00:46 crc kubenswrapper[4676]: I0930 15:00:46.286962 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-777c6c994b-kk5rn_70c7535f-4b3b-438f-9470-c857ece73452/barbican-keystone-listener/0.log" Sep 30 15:00:46 crc kubenswrapper[4676]: I0930 15:00:46.316819 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-777c6c994b-kk5rn_70c7535f-4b3b-438f-9470-c857ece73452/barbican-keystone-listener-log/0.log" Sep 30 15:00:46 crc kubenswrapper[4676]: I0930 15:00:46.551491 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-bf6466755-m2t9t_39f521d2-b195-4179-a114-1c1611e4ba2f/barbican-worker/0.log" Sep 30 15:00:46 crc kubenswrapper[4676]: I0930 15:00:46.565139 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-bf6466755-m2t9t_39f521d2-b195-4179-a114-1c1611e4ba2f/barbican-worker-log/0.log" Sep 30 15:00:46 crc kubenswrapper[4676]: I0930 15:00:46.788716 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn_9093887d-1e08-4208-9584-a78c329fd7b0/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.001324 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/ceilometer-notification-agent/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.062023 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/ceilometer-central-agent/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.084803 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/proxy-httpd/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.217005 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/sg-core/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.365983 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f490182f-5ea6-45fa-85d0-a6b1c02c5849/cinder-api/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.474699 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f490182f-5ea6-45fa-85d0-a6b1c02c5849/cinder-api-log/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.634991 4676 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_cinder-scheduler-0_21c274fe-4499-4294-b725-96e48b657186/cinder-scheduler/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.733678 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_21c274fe-4499-4294-b725-96e48b657186/probe/0.log" Sep 30 15:00:47 crc kubenswrapper[4676]: I0930 15:00:47.880250 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-56d9g_24983a6b-dac1-4567-b8b8-ded54e7287bb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:48 crc kubenswrapper[4676]: I0930 15:00:48.046366 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc_2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:48 crc kubenswrapper[4676]: I0930 15:00:48.297898 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz_6f7ebae7-0748-4052-859c-fb6a5fa89d33/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:48 crc kubenswrapper[4676]: I0930 15:00:48.998300 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-qglmb_a391dbc4-4a80-4a26-9e6d-5903b425ae97/init/0.log" Sep 30 15:00:49 crc kubenswrapper[4676]: I0930 15:00:49.927641 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-qglmb_a391dbc4-4a80-4a26-9e6d-5903b425ae97/init/0.log" Sep 30 15:00:50 crc kubenswrapper[4676]: I0930 15:00:50.314458 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-qglmb_a391dbc4-4a80-4a26-9e6d-5903b425ae97/dnsmasq-dns/0.log" Sep 30 15:00:50 crc kubenswrapper[4676]: I0930 15:00:50.782145 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c_8b08c117-d7d7-4bc3-89a0-8a05169688fa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:51 crc kubenswrapper[4676]: I0930 15:00:51.369724 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a22bcf65-b8af-4f8a-845c-31b1b3609e05/glance-log/0.log" Sep 30 15:00:51 crc kubenswrapper[4676]: I0930 15:00:51.371943 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a22bcf65-b8af-4f8a-845c-31b1b3609e05/glance-httpd/0.log" Sep 30 15:00:51 crc kubenswrapper[4676]: I0930 15:00:51.404079 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cec7cd30-e0cb-41bb-a620-8d3fad4e2338/glance-httpd/0.log" Sep 30 15:00:51 crc kubenswrapper[4676]: I0930 15:00:51.561951 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cec7cd30-e0cb-41bb-a620-8d3fad4e2338/glance-log/0.log" Sep 30 15:00:51 crc kubenswrapper[4676]: I0930 15:00:51.700107 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fc47cdb4-6758j_a020c8ba-b848-4a3f-80e4-b3692cf99ffa/horizon/0.log" Sep 30 15:00:52 crc kubenswrapper[4676]: I0930 15:00:52.056440 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-524mx_425308e0-6300-4e6a-922e-dc9ef39d61f8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:52 crc kubenswrapper[4676]: I0930 15:00:52.061277 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fc47cdb4-6758j_a020c8ba-b848-4a3f-80e4-b3692cf99ffa/horizon-log/0.log" Sep 30 15:00:52 crc kubenswrapper[4676]: I0930 15:00:52.109527 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-j8kmb_29932146-0fdd-4717-8a42-2b04967df9ce/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:52 crc kubenswrapper[4676]: I0930 15:00:52.381597 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ab292b94-70ab-4d77-9100-d6db2654e3e2/kube-state-metrics/0.log" Sep 30 15:00:52 crc kubenswrapper[4676]: I0930 15:00:52.435206 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-d85658b59-96mgj_4e10025d-8396-4100-8652-3358d52c3199/keystone-api/0.log" Sep 30 15:00:52 crc kubenswrapper[4676]: I0930 15:00:52.622578 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6_53729d22-521b-4f61-a225-832492a98b7b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:53 crc kubenswrapper[4676]: I0930 15:00:53.601175 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56956855f5-jwqlp_af5d35f3-c607-4084-9585-0a750ea54db5/neutron-httpd/0.log" Sep 30 15:00:53 crc kubenswrapper[4676]: I0930 15:00:53.726383 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56956855f5-jwqlp_af5d35f3-c607-4084-9585-0a750ea54db5/neutron-api/0.log" Sep 30 15:00:53 crc kubenswrapper[4676]: I0930 15:00:53.874679 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m_87967da4-c3f2-46e1-ae80-230612ebe6af/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:54 crc kubenswrapper[4676]: I0930 15:00:54.347820 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b49d9f0a-d50a-409b-b985-c09b657e9ba2/nova-api-api/0.log" Sep 30 15:00:54 crc kubenswrapper[4676]: I0930 15:00:54.591492 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_b49d9f0a-d50a-409b-b985-c09b657e9ba2/nova-api-log/0.log" Sep 30 15:00:54 crc kubenswrapper[4676]: I0930 15:00:54.771280 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_7c38ef92-eb09-4ce7-b23f-10886d83860c/nova-cell0-conductor-conductor/0.log" Sep 30 15:00:54 crc kubenswrapper[4676]: I0930 15:00:54.957865 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f3250c50-5426-440d-a8ba-9a4f75001b16/nova-cell1-conductor-conductor/0.log" Sep 30 15:00:55 crc kubenswrapper[4676]: I0930 15:00:55.113370 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1cc6f007-8ed9-4512-8b1b-70e2081f873a/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 15:00:55 crc kubenswrapper[4676]: I0930 15:00:55.380486 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gjtn5_e4cb3ae8-cc50-4de4-b279-51105c6fc45c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:55 crc kubenswrapper[4676]: I0930 15:00:55.568015 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b9578818-8dfa-4aec-8923-d1d9424068be/nova-metadata-log/0.log" Sep 30 15:00:55 crc kubenswrapper[4676]: I0930 15:00:55.986396 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_92d9c3ce-1a0b-48d0-a88b-ce25162e54b0/nova-scheduler-scheduler/0.log" Sep 30 15:00:56 crc kubenswrapper[4676]: I0930 15:00:56.139068 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9762912c-8ca3-4791-93c0-4d5728543998/mysql-bootstrap/0.log" Sep 30 15:00:56 crc kubenswrapper[4676]: I0930 15:00:56.394766 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9762912c-8ca3-4791-93c0-4d5728543998/galera/0.log" Sep 30 15:00:56 crc kubenswrapper[4676]: I0930 
15:00:56.428645 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9762912c-8ca3-4791-93c0-4d5728543998/mysql-bootstrap/0.log" Sep 30 15:00:56 crc kubenswrapper[4676]: I0930 15:00:56.665877 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d32e1e85-6a70-4751-9223-85e7018c3cc7/mysql-bootstrap/0.log" Sep 30 15:00:56 crc kubenswrapper[4676]: I0930 15:00:56.931600 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d32e1e85-6a70-4751-9223-85e7018c3cc7/mysql-bootstrap/0.log" Sep 30 15:00:56 crc kubenswrapper[4676]: I0930 15:00:56.941542 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d32e1e85-6a70-4751-9223-85e7018c3cc7/galera/0.log" Sep 30 15:00:57 crc kubenswrapper[4676]: I0930 15:00:57.110214 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b9578818-8dfa-4aec-8923-d1d9424068be/nova-metadata-metadata/0.log" Sep 30 15:00:57 crc kubenswrapper[4676]: I0930 15:00:57.206684 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f682d501-bba0-4b08-98aa-0ee2a0603939/openstackclient/0.log" Sep 30 15:00:57 crc kubenswrapper[4676]: I0930 15:00:57.486786 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9nlxb_72431bf0-bd8f-431d-81a8-082f9ef654e1/ovn-controller/0.log" Sep 30 15:00:57 crc kubenswrapper[4676]: I0930 15:00:57.703622 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w8c76_39632f21-878d-4bb9-ba72-afcac2cd0b5d/openstack-network-exporter/0.log" Sep 30 15:00:57 crc kubenswrapper[4676]: I0930 15:00:57.827491 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovsdb-server-init/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.014253 4676 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovs-vswitchd/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.053163 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovsdb-server-init/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.067333 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovsdb-server/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.353720 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-js2fj_ac2b4c79-3867-4f1b-bb55-d0978cffaded/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.506223 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2aa43108-6602-4fc3-b8b1-ce07a8ef0b31/openstack-network-exporter/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.614518 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2aa43108-6602-4fc3-b8b1-ce07a8ef0b31/ovn-northd/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.787409 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_35e313d1-3779-4eb1-b12f-c3b5432dfd1d/openstack-network-exporter/0.log" Sep 30 15:00:58 crc kubenswrapper[4676]: I0930 15:00:58.896402 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_35e313d1-3779-4eb1-b12f-c3b5432dfd1d/ovsdbserver-nb/0.log" Sep 30 15:00:59 crc kubenswrapper[4676]: I0930 15:00:59.052593 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e73b3580-31d5-4c06-9bd8-acbd16c5c48d/openstack-network-exporter/0.log" Sep 30 15:00:59 crc kubenswrapper[4676]: I0930 
15:00:59.098823 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e73b3580-31d5-4c06-9bd8-acbd16c5c48d/ovsdbserver-sb/0.log" Sep 30 15:00:59 crc kubenswrapper[4676]: I0930 15:00:59.461244 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6558bbc9d4-wdcbn_c5505b25-a501-44f0-8b24-6630fb71d41b/placement-api/0.log" Sep 30 15:00:59 crc kubenswrapper[4676]: I0930 15:00:59.466838 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6558bbc9d4-wdcbn_c5505b25-a501-44f0-8b24-6630fb71d41b/placement-log/0.log" Sep 30 15:00:59 crc kubenswrapper[4676]: I0930 15:00:59.718980 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56d86f95-7d72-42d4-84d7-fdca29b1270f/setup-container/0.log" Sep 30 15:00:59 crc kubenswrapper[4676]: I0930 15:00:59.940431 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56d86f95-7d72-42d4-84d7-fdca29b1270f/setup-container/0.log" Sep 30 15:00:59 crc kubenswrapper[4676]: I0930 15:00:59.973221 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56d86f95-7d72-42d4-84d7-fdca29b1270f/rabbitmq/0.log" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.158648 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320741-8mjdr"] Sep 30 15:01:00 crc kubenswrapper[4676]: E0930 15:01:00.159150 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d072b0f8-65f0-49b4-b184-8bc190f9090c" containerName="collect-profiles" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.159166 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d072b0f8-65f0-49b4-b184-8bc190f9090c" containerName="collect-profiles" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.159367 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d072b0f8-65f0-49b4-b184-8bc190f9090c" 
containerName="collect-profiles" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.160162 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.186208 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320741-8mjdr"] Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.206332 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_140577c7-99f4-4dc1-85dd-1bec990df549/setup-container/0.log" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.308207 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whcg\" (UniqueName: \"kubernetes.io/projected/a4a69823-fcbd-4141-83bd-6f242e8304be-kube-api-access-9whcg\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.308332 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-combined-ca-bundle\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.308498 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-fernet-keys\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.308539 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-config-data\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.410039 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whcg\" (UniqueName: \"kubernetes.io/projected/a4a69823-fcbd-4141-83bd-6f242e8304be-kube-api-access-9whcg\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.410091 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-combined-ca-bundle\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.410207 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-fernet-keys\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.410232 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-config-data\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.417980 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-config-data\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.418948 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-combined-ca-bundle\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.421705 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-fernet-keys\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.436931 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whcg\" (UniqueName: \"kubernetes.io/projected/a4a69823-fcbd-4141-83bd-6f242e8304be-kube-api-access-9whcg\") pod \"keystone-cron-29320741-8mjdr\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.484437 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.486198 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_140577c7-99f4-4dc1-85dd-1bec990df549/rabbitmq/0.log" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.531623 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_140577c7-99f4-4dc1-85dd-1bec990df549/setup-container/0.log" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.969621 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-cj72b_bcad236c-a0f7-47a2-ae9e-e52839eaee9d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:01:00 crc kubenswrapper[4676]: I0930 15:01:00.973638 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz_fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:01:01 crc kubenswrapper[4676]: I0930 15:01:01.206988 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320741-8mjdr"] Sep 30 15:01:01 crc kubenswrapper[4676]: I0930 15:01:01.290428 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm_8d063f50-40de-47eb-9849-8b29cee35392/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:01:01 crc kubenswrapper[4676]: I0930 15:01:01.663043 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-8mjdr" event={"ID":"a4a69823-fcbd-4141-83bd-6f242e8304be","Type":"ContainerStarted","Data":"1686ea2ad5545f5a63b14104050d79b91ba97f52f456e3771cfdd59045a14f00"} Sep 30 15:01:01 crc kubenswrapper[4676]: I0930 15:01:01.667404 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-flzt4_44bcdced-a8cf-4b1d-baa4-31988a1ca72d/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:01:01 crc kubenswrapper[4676]: I0930 15:01:01.696406 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-kgg8r_c9aca039-cea1-4fe5-8ee1-226f22cbefd2/ssh-known-hosts-edpm-deployment/0.log" Sep 30 15:01:01 crc kubenswrapper[4676]: I0930 15:01:01.968617 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f9769955f-6wd7m_b18c2fcd-dc66-434b-b3ef-61215f24a511/proxy-server/0.log" Sep 30 15:01:02 crc kubenswrapper[4676]: I0930 15:01:02.202447 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f9769955f-6wd7m_b18c2fcd-dc66-434b-b3ef-61215f24a511/proxy-httpd/0.log" Sep 30 15:01:02 crc kubenswrapper[4676]: I0930 15:01:02.578401 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vbg62_fb668321-0ea7-4c30-9773-6c7f511959f4/swift-ring-rebalance/0.log" Sep 30 15:01:02 crc kubenswrapper[4676]: I0930 15:01:02.679279 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-8mjdr" event={"ID":"a4a69823-fcbd-4141-83bd-6f242e8304be","Type":"ContainerStarted","Data":"3b458d674800d49a310ed29678c2caacb9b804386441d40815085eb519a2627a"} Sep 30 15:01:02 crc kubenswrapper[4676]: I0930 15:01:02.821064 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-auditor/0.log" Sep 30 15:01:02 crc kubenswrapper[4676]: I0930 15:01:02.856802 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-reaper/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.052482 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-server/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.071695 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-replicator/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.118384 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-auditor/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.261026 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-replicator/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.352927 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-updater/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.369734 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-server/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.504179 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-auditor/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.593979 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-expirer/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.632854 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-replicator/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.727557 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-server/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.854574 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-updater/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.871231 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/rsync/0.log" Sep 30 15:01:03 crc kubenswrapper[4676]: I0930 15:01:03.991625 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/swift-recon-cron/0.log" Sep 30 15:01:04 crc kubenswrapper[4676]: I0930 15:01:04.192041 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk_6b813464-177e-4354-af90-edefef63c05c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:01:04 crc kubenswrapper[4676]: I0930 15:01:04.387394 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_10b046c1-241e-4dfd-9aa3-d3e5532a6190/tempest-tests-tempest-tests-runner/0.log" Sep 30 15:01:04 crc kubenswrapper[4676]: I0930 15:01:04.446436 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0def7648-4733-47fb-bd01-0dba601ea3cc/test-operator-logs-container/0.log" Sep 30 15:01:04 crc kubenswrapper[4676]: I0930 15:01:04.652631 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp_d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:01:04 crc kubenswrapper[4676]: I0930 15:01:04.713552 4676 generic.go:334] "Generic (PLEG): container finished" podID="a4a69823-fcbd-4141-83bd-6f242e8304be" 
containerID="3b458d674800d49a310ed29678c2caacb9b804386441d40815085eb519a2627a" exitCode=0 Sep 30 15:01:04 crc kubenswrapper[4676]: I0930 15:01:04.714184 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-8mjdr" event={"ID":"a4a69823-fcbd-4141-83bd-6f242e8304be","Type":"ContainerDied","Data":"3b458d674800d49a310ed29678c2caacb9b804386441d40815085eb519a2627a"} Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.224077 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.350579 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9whcg\" (UniqueName: \"kubernetes.io/projected/a4a69823-fcbd-4141-83bd-6f242e8304be-kube-api-access-9whcg\") pod \"a4a69823-fcbd-4141-83bd-6f242e8304be\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.350662 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-config-data\") pod \"a4a69823-fcbd-4141-83bd-6f242e8304be\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.350735 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-fernet-keys\") pod \"a4a69823-fcbd-4141-83bd-6f242e8304be\" (UID: \"a4a69823-fcbd-4141-83bd-6f242e8304be\") " Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.350864 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-combined-ca-bundle\") pod \"a4a69823-fcbd-4141-83bd-6f242e8304be\" (UID: 
\"a4a69823-fcbd-4141-83bd-6f242e8304be\") " Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.364102 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a69823-fcbd-4141-83bd-6f242e8304be-kube-api-access-9whcg" (OuterVolumeSpecName: "kube-api-access-9whcg") pod "a4a69823-fcbd-4141-83bd-6f242e8304be" (UID: "a4a69823-fcbd-4141-83bd-6f242e8304be"). InnerVolumeSpecName "kube-api-access-9whcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.366265 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a4a69823-fcbd-4141-83bd-6f242e8304be" (UID: "a4a69823-fcbd-4141-83bd-6f242e8304be"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.388702 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a69823-fcbd-4141-83bd-6f242e8304be" (UID: "a4a69823-fcbd-4141-83bd-6f242e8304be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.453232 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.453273 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9whcg\" (UniqueName: \"kubernetes.io/projected/a4a69823-fcbd-4141-83bd-6f242e8304be-kube-api-access-9whcg\") on node \"crc\" DevicePath \"\"" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.453287 4676 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.454416 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-config-data" (OuterVolumeSpecName: "config-data") pod "a4a69823-fcbd-4141-83bd-6f242e8304be" (UID: "a4a69823-fcbd-4141-83bd-6f242e8304be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.556793 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a69823-fcbd-4141-83bd-6f242e8304be-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.739445 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320741-8mjdr" event={"ID":"a4a69823-fcbd-4141-83bd-6f242e8304be","Type":"ContainerDied","Data":"1686ea2ad5545f5a63b14104050d79b91ba97f52f456e3771cfdd59045a14f00"} Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.739504 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1686ea2ad5545f5a63b14104050d79b91ba97f52f456e3771cfdd59045a14f00" Sep 30 15:01:06 crc kubenswrapper[4676]: I0930 15:01:06.739653 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320741-8mjdr" Sep 30 15:01:13 crc kubenswrapper[4676]: I0930 15:01:13.208220 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7aef6349-6b33-4f9e-972d-a990cb3ff62e/memcached/0.log" Sep 30 15:02:03 crc kubenswrapper[4676]: I0930 15:02:03.255407 4676 generic.go:334] "Generic (PLEG): container finished" podID="62658c01-335b-4901-8fa2-fe4bca4348d3" containerID="9bcdc1a8d44a9a07a94c6f07e7996f5427a28a3704e17cea61c4ce265a85651d" exitCode=0 Sep 30 15:02:03 crc kubenswrapper[4676]: I0930 15:02:03.255486 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" event={"ID":"62658c01-335b-4901-8fa2-fe4bca4348d3","Type":"ContainerDied","Data":"9bcdc1a8d44a9a07a94c6f07e7996f5427a28a3704e17cea61c4ce265a85651d"} Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.358233 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.392231 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-fmh66"] Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.401343 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-fmh66"] Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.516380 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62658c01-335b-4901-8fa2-fe4bca4348d3-host\") pod \"62658c01-335b-4901-8fa2-fe4bca4348d3\" (UID: \"62658c01-335b-4901-8fa2-fe4bca4348d3\") " Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.516961 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlwdb\" (UniqueName: \"kubernetes.io/projected/62658c01-335b-4901-8fa2-fe4bca4348d3-kube-api-access-vlwdb\") pod \"62658c01-335b-4901-8fa2-fe4bca4348d3\" (UID: \"62658c01-335b-4901-8fa2-fe4bca4348d3\") " Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.516469 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62658c01-335b-4901-8fa2-fe4bca4348d3-host" (OuterVolumeSpecName: "host") pod "62658c01-335b-4901-8fa2-fe4bca4348d3" (UID: "62658c01-335b-4901-8fa2-fe4bca4348d3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.518022 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62658c01-335b-4901-8fa2-fe4bca4348d3-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.523409 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62658c01-335b-4901-8fa2-fe4bca4348d3-kube-api-access-vlwdb" (OuterVolumeSpecName: "kube-api-access-vlwdb") pod "62658c01-335b-4901-8fa2-fe4bca4348d3" (UID: "62658c01-335b-4901-8fa2-fe4bca4348d3"). InnerVolumeSpecName "kube-api-access-vlwdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:04 crc kubenswrapper[4676]: I0930 15:02:04.619976 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlwdb\" (UniqueName: \"kubernetes.io/projected/62658c01-335b-4901-8fa2-fe4bca4348d3-kube-api-access-vlwdb\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.274244 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9f4dbf90e982ea36688e6be41555a3032a0bad40d8e3328ac6a1ddb42bb3965" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.274343 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-fmh66" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.446071 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62658c01-335b-4901-8fa2-fe4bca4348d3" path="/var/lib/kubelet/pods/62658c01-335b-4901-8fa2-fe4bca4348d3/volumes" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.554474 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-mwb65"] Sep 30 15:02:05 crc kubenswrapper[4676]: E0930 15:02:05.554921 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a69823-fcbd-4141-83bd-6f242e8304be" containerName="keystone-cron" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.554980 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a69823-fcbd-4141-83bd-6f242e8304be" containerName="keystone-cron" Sep 30 15:02:05 crc kubenswrapper[4676]: E0930 15:02:05.555030 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62658c01-335b-4901-8fa2-fe4bca4348d3" containerName="container-00" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.555037 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="62658c01-335b-4901-8fa2-fe4bca4348d3" containerName="container-00" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.555231 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a69823-fcbd-4141-83bd-6f242e8304be" containerName="keystone-cron" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.555254 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="62658c01-335b-4901-8fa2-fe4bca4348d3" containerName="container-00" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.555959 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.738450 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r77km\" (UniqueName: \"kubernetes.io/projected/03d22a2b-171b-4e68-8469-c761d145e4d7-kube-api-access-r77km\") pod \"crc-debug-mwb65\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.738593 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03d22a2b-171b-4e68-8469-c761d145e4d7-host\") pod \"crc-debug-mwb65\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.840781 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03d22a2b-171b-4e68-8469-c761d145e4d7-host\") pod \"crc-debug-mwb65\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.840945 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r77km\" (UniqueName: \"kubernetes.io/projected/03d22a2b-171b-4e68-8469-c761d145e4d7-kube-api-access-r77km\") pod \"crc-debug-mwb65\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.840954 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03d22a2b-171b-4e68-8469-c761d145e4d7-host\") pod \"crc-debug-mwb65\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:05 crc 
kubenswrapper[4676]: I0930 15:02:05.859949 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r77km\" (UniqueName: \"kubernetes.io/projected/03d22a2b-171b-4e68-8469-c761d145e4d7-kube-api-access-r77km\") pod \"crc-debug-mwb65\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:05 crc kubenswrapper[4676]: I0930 15:02:05.873188 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:06 crc kubenswrapper[4676]: I0930 15:02:06.283816 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" event={"ID":"03d22a2b-171b-4e68-8469-c761d145e4d7","Type":"ContainerStarted","Data":"88af6369ea5f80e4b1fd977c9f74cbf68fbbef393507358f88acbbfc54fc70c2"} Sep 30 15:02:06 crc kubenswrapper[4676]: I0930 15:02:06.284174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" event={"ID":"03d22a2b-171b-4e68-8469-c761d145e4d7","Type":"ContainerStarted","Data":"459d3cbe6ea2d382372a0371dbf7c4469c854b773898a09b9fda5b2a737c1a6b"} Sep 30 15:02:06 crc kubenswrapper[4676]: I0930 15:02:06.299359 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" podStartSLOduration=1.299342964 podStartE2EDuration="1.299342964s" podCreationTimestamp="2025-09-30 15:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:02:06.297192806 +0000 UTC m=+3830.280281225" watchObservedRunningTime="2025-09-30 15:02:06.299342964 +0000 UTC m=+3830.282431393" Sep 30 15:02:07 crc kubenswrapper[4676]: I0930 15:02:07.293170 4676 generic.go:334] "Generic (PLEG): container finished" podID="03d22a2b-171b-4e68-8469-c761d145e4d7" 
containerID="88af6369ea5f80e4b1fd977c9f74cbf68fbbef393507358f88acbbfc54fc70c2" exitCode=0 Sep 30 15:02:07 crc kubenswrapper[4676]: I0930 15:02:07.293468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" event={"ID":"03d22a2b-171b-4e68-8469-c761d145e4d7","Type":"ContainerDied","Data":"88af6369ea5f80e4b1fd977c9f74cbf68fbbef393507358f88acbbfc54fc70c2"} Sep 30 15:02:08 crc kubenswrapper[4676]: I0930 15:02:08.441010 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:08 crc kubenswrapper[4676]: I0930 15:02:08.583514 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03d22a2b-171b-4e68-8469-c761d145e4d7-host\") pod \"03d22a2b-171b-4e68-8469-c761d145e4d7\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " Sep 30 15:02:08 crc kubenswrapper[4676]: I0930 15:02:08.583588 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r77km\" (UniqueName: \"kubernetes.io/projected/03d22a2b-171b-4e68-8469-c761d145e4d7-kube-api-access-r77km\") pod \"03d22a2b-171b-4e68-8469-c761d145e4d7\" (UID: \"03d22a2b-171b-4e68-8469-c761d145e4d7\") " Sep 30 15:02:08 crc kubenswrapper[4676]: I0930 15:02:08.583662 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03d22a2b-171b-4e68-8469-c761d145e4d7-host" (OuterVolumeSpecName: "host") pod "03d22a2b-171b-4e68-8469-c761d145e4d7" (UID: "03d22a2b-171b-4e68-8469-c761d145e4d7"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:02:08 crc kubenswrapper[4676]: I0930 15:02:08.584373 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03d22a2b-171b-4e68-8469-c761d145e4d7-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:08 crc kubenswrapper[4676]: I0930 15:02:08.589357 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d22a2b-171b-4e68-8469-c761d145e4d7-kube-api-access-r77km" (OuterVolumeSpecName: "kube-api-access-r77km") pod "03d22a2b-171b-4e68-8469-c761d145e4d7" (UID: "03d22a2b-171b-4e68-8469-c761d145e4d7"). InnerVolumeSpecName "kube-api-access-r77km". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:02:08 crc kubenswrapper[4676]: I0930 15:02:08.685956 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r77km\" (UniqueName: \"kubernetes.io/projected/03d22a2b-171b-4e68-8469-c761d145e4d7-kube-api-access-r77km\") on node \"crc\" DevicePath \"\"" Sep 30 15:02:09 crc kubenswrapper[4676]: I0930 15:02:09.319322 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" event={"ID":"03d22a2b-171b-4e68-8469-c761d145e4d7","Type":"ContainerDied","Data":"459d3cbe6ea2d382372a0371dbf7c4469c854b773898a09b9fda5b2a737c1a6b"} Sep 30 15:02:09 crc kubenswrapper[4676]: I0930 15:02:09.319363 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="459d3cbe6ea2d382372a0371dbf7c4469c854b773898a09b9fda5b2a737c1a6b" Sep 30 15:02:09 crc kubenswrapper[4676]: I0930 15:02:09.319385 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-mwb65" Sep 30 15:02:13 crc kubenswrapper[4676]: I0930 15:02:13.368008 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-mwb65"] Sep 30 15:02:13 crc kubenswrapper[4676]: I0930 15:02:13.376935 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-mwb65"] Sep 30 15:02:13 crc kubenswrapper[4676]: I0930 15:02:13.446040 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d22a2b-171b-4e68-8469-c761d145e4d7" path="/var/lib/kubelet/pods/03d22a2b-171b-4e68-8469-c761d145e4d7/volumes" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.521969 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-h9rss"] Sep 30 15:02:14 crc kubenswrapper[4676]: E0930 15:02:14.523001 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d22a2b-171b-4e68-8469-c761d145e4d7" containerName="container-00" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.523017 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d22a2b-171b-4e68-8469-c761d145e4d7" containerName="container-00" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.523297 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d22a2b-171b-4e68-8469-c761d145e4d7" containerName="container-00" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.523958 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-h9rss" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.693842 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f4bf8c-601d-4114-9218-d763bdadc6a0-host\") pod \"crc-debug-h9rss\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") " pod="openshift-must-gather-q7tf5/crc-debug-h9rss" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.693949 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75cr\" (UniqueName: \"kubernetes.io/projected/59f4bf8c-601d-4114-9218-d763bdadc6a0-kube-api-access-j75cr\") pod \"crc-debug-h9rss\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") " pod="openshift-must-gather-q7tf5/crc-debug-h9rss" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.795680 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f4bf8c-601d-4114-9218-d763bdadc6a0-host\") pod \"crc-debug-h9rss\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") " pod="openshift-must-gather-q7tf5/crc-debug-h9rss" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.795760 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75cr\" (UniqueName: \"kubernetes.io/projected/59f4bf8c-601d-4114-9218-d763bdadc6a0-kube-api-access-j75cr\") pod \"crc-debug-h9rss\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") " pod="openshift-must-gather-q7tf5/crc-debug-h9rss" Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.795835 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f4bf8c-601d-4114-9218-d763bdadc6a0-host\") pod \"crc-debug-h9rss\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") " pod="openshift-must-gather-q7tf5/crc-debug-h9rss" Sep 30 15:02:14 crc 
kubenswrapper[4676]: I0930 15:02:14.817571 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75cr\" (UniqueName: \"kubernetes.io/projected/59f4bf8c-601d-4114-9218-d763bdadc6a0-kube-api-access-j75cr\") pod \"crc-debug-h9rss\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") " pod="openshift-must-gather-q7tf5/crc-debug-h9rss"
Sep 30 15:02:14 crc kubenswrapper[4676]: I0930 15:02:14.845872 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-h9rss"
Sep 30 15:02:15 crc kubenswrapper[4676]: I0930 15:02:15.381958 4676 generic.go:334] "Generic (PLEG): container finished" podID="59f4bf8c-601d-4114-9218-d763bdadc6a0" containerID="5afc5087e78af333beb4ae6094d545820f40c26f831ca3a7a3f082f5117573bd" exitCode=0
Sep 30 15:02:15 crc kubenswrapper[4676]: I0930 15:02:15.382035 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-h9rss" event={"ID":"59f4bf8c-601d-4114-9218-d763bdadc6a0","Type":"ContainerDied","Data":"5afc5087e78af333beb4ae6094d545820f40c26f831ca3a7a3f082f5117573bd"}
Sep 30 15:02:15 crc kubenswrapper[4676]: I0930 15:02:15.382289 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/crc-debug-h9rss" event={"ID":"59f4bf8c-601d-4114-9218-d763bdadc6a0","Type":"ContainerStarted","Data":"d9ddf8d476bcdcbb46c6057ba7b93dfbc16c566d45c1fc75e3c06c78c9fe1bfb"}
Sep 30 15:02:15 crc kubenswrapper[4676]: I0930 15:02:15.418019 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-h9rss"]
Sep 30 15:02:15 crc kubenswrapper[4676]: I0930 15:02:15.427677 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q7tf5/crc-debug-h9rss"]
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.497413 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-h9rss"
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.632177 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f4bf8c-601d-4114-9218-d763bdadc6a0-host\") pod \"59f4bf8c-601d-4114-9218-d763bdadc6a0\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") "
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.632242 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j75cr\" (UniqueName: \"kubernetes.io/projected/59f4bf8c-601d-4114-9218-d763bdadc6a0-kube-api-access-j75cr\") pod \"59f4bf8c-601d-4114-9218-d763bdadc6a0\" (UID: \"59f4bf8c-601d-4114-9218-d763bdadc6a0\") "
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.632317 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59f4bf8c-601d-4114-9218-d763bdadc6a0-host" (OuterVolumeSpecName: "host") pod "59f4bf8c-601d-4114-9218-d763bdadc6a0" (UID: "59f4bf8c-601d-4114-9218-d763bdadc6a0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.632786 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f4bf8c-601d-4114-9218-d763bdadc6a0-host\") on node \"crc\" DevicePath \"\""
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.639327 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f4bf8c-601d-4114-9218-d763bdadc6a0-kube-api-access-j75cr" (OuterVolumeSpecName: "kube-api-access-j75cr") pod "59f4bf8c-601d-4114-9218-d763bdadc6a0" (UID: "59f4bf8c-601d-4114-9218-d763bdadc6a0"). InnerVolumeSpecName "kube-api-access-j75cr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.735132 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j75cr\" (UniqueName: \"kubernetes.io/projected/59f4bf8c-601d-4114-9218-d763bdadc6a0-kube-api-access-j75cr\") on node \"crc\" DevicePath \"\""
Sep 30 15:02:16 crc kubenswrapper[4676]: I0930 15:02:16.857524 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/util/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.038656 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/util/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.065823 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/pull/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.091603 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/pull/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.228274 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/pull/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.276662 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/util/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.293245 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/extract/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.402961 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/crc-debug-h9rss"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.402990 4676 scope.go:117] "RemoveContainer" containerID="5afc5087e78af333beb4ae6094d545820f40c26f831ca3a7a3f082f5117573bd"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.448324 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f4bf8c-601d-4114-9218-d763bdadc6a0" path="/var/lib/kubelet/pods/59f4bf8c-601d-4114-9218-d763bdadc6a0/volumes"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.485039 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-7b844_9e7d83e3-0f96-4a53-88ad-568d39435e5f/kube-rbac-proxy/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.540409 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-7b844_9e7d83e3-0f96-4a53-88ad-568d39435e5f/manager/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.557238 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-7tf2x_2342a742-ce41-4487-9d32-34fc69cb4445/kube-rbac-proxy/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.680785 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-7tf2x_2342a742-ce41-4487-9d32-34fc69cb4445/manager/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.721352 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7qc9p_8f9d1069-29eb-42e5-8029-1ed616f31c4a/kube-rbac-proxy/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.789480 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7qc9p_8f9d1069-29eb-42e5-8029-1ed616f31c4a/manager/0.log"
Sep 30 15:02:17 crc kubenswrapper[4676]: I0930 15:02:17.927320 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-zn6zg_b5ead6b1-3f68-454e-847c-89cac8d7f1f0/kube-rbac-proxy/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.007356 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-zn6zg_b5ead6b1-3f68-454e-847c-89cac8d7f1f0/manager/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.094867 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-6qcf5_930c8b21-3bfd-497b-9bc7-60f2cf7abde6/kube-rbac-proxy/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.138815 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-6qcf5_930c8b21-3bfd-497b-9bc7-60f2cf7abde6/manager/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.180562 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-s9znt_1d51c97e-7c47-4274-8bd4-bc3d7402a378/kube-rbac-proxy/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.312973 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-s9znt_1d51c97e-7c47-4274-8bd4-bc3d7402a378/manager/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.425741 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-mg54v_c15f6efe-27f0-4f55-b1f0-957366ff23a4/kube-rbac-proxy/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.516371 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-mg54v_c15f6efe-27f0-4f55-b1f0-957366ff23a4/manager/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.537459 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4dzg7_5e535753-178a-4b7b-b20c-e13fa0be5ce1/kube-rbac-proxy/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.633744 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4dzg7_5e535753-178a-4b7b-b20c-e13fa0be5ce1/manager/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.744459 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-8pv7q_7e6672d2-5e94-4d5d-b927-ad3573b95469/kube-rbac-proxy/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.805775 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-8pv7q_7e6672d2-5e94-4d5d-b927-ad3573b95469/manager/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.926123 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-gh2vw_daee0b60-331c-4108-8881-66cf4eb731e0/manager/0.log"
Sep 30 15:02:18 crc kubenswrapper[4676]: I0930 15:02:18.932093 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-gh2vw_daee0b60-331c-4108-8881-66cf4eb731e0/kube-rbac-proxy/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.037323 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n5djz_d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85/kube-rbac-proxy/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.110584 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n5djz_d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85/manager/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.186709 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-t6h2t_dbe0db98-4cbd-49d2-9f6a-f54a8189c64b/kube-rbac-proxy/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.288617 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-t6h2t_dbe0db98-4cbd-49d2-9f6a-f54a8189c64b/manager/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.351397 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-8fx9n_5dbc4210-e31a-4bf8-a5cb-6f00a7406743/kube-rbac-proxy/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.499323 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-8fx9n_5dbc4210-e31a-4bf8-a5cb-6f00a7406743/manager/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.535395 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-8m45v_43f93725-c577-4253-ae9c-7d14e8aec0b9/kube-rbac-proxy/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.567289 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-8m45v_43f93725-c577-4253-ae9c-7d14e8aec0b9/manager/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.723061 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qd7b8_96c8a26f-c044-429d-90eb-d0342486c32f/kube-rbac-proxy/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.733770 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qd7b8_96c8a26f-c044-429d-90eb-d0342486c32f/manager/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.937630 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f5687bfdd-7nt8g_72b68346-543d-4b80-ba31-9bcb856b6989/kube-rbac-proxy/0.log"
Sep 30 15:02:19 crc kubenswrapper[4676]: I0930 15:02:19.978724 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55ccb8ddf4-slxtv_df3c9717-78cf-49b2-a967-7177da8f2e17/kube-rbac-proxy/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.215096 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fhxvs_2bf0efcb-3cb6-491b-961f-6655a84de268/registry-server/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.386579 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55ccb8ddf4-slxtv_df3c9717-78cf-49b2-a967-7177da8f2e17/operator/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.411497 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-g2s9t_e51ea15a-8d04-4d56-956d-0fcf41846eb8/kube-rbac-proxy/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.529329 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-g2s9t_e51ea15a-8d04-4d56-956d-0fcf41846eb8/manager/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.628956 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-9dzn5_a3762232-4e9f-452e-aea4-c5feb443ad75/kube-rbac-proxy/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.667248 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-9dzn5_a3762232-4e9f-452e-aea4-c5feb443ad75/manager/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.821506 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-xkkwv_677c476e-c8df-4a21-9968-b2bd23b246f6/operator/0.log"
Sep 30 15:02:20 crc kubenswrapper[4676]: I0930 15:02:20.943354 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-l4p8f_cedc986e-ac92-45e8-862a-fc4dcb60455d/kube-rbac-proxy/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.056663 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-l4p8f_cedc986e-ac92-45e8-862a-fc4dcb60455d/manager/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.165190 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wf6fb_aa6dd699-ccd5-476f-ab9c-3d4841ed591a/kube-rbac-proxy/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.194720 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f5687bfdd-7nt8g_72b68346-543d-4b80-ba31-9bcb856b6989/manager/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.289222 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-bvj6l_129d5672-c8dd-4a63-8d48-dc95c84a45b2/kube-rbac-proxy/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.292148 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wf6fb_aa6dd699-ccd5-476f-ab9c-3d4841ed591a/manager/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.306458 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zbl25"]
Sep 30 15:02:21 crc kubenswrapper[4676]: E0930 15:02:21.306969 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f4bf8c-601d-4114-9218-d763bdadc6a0" containerName="container-00"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.306993 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f4bf8c-601d-4114-9218-d763bdadc6a0" containerName="container-00"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.310072 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f4bf8c-601d-4114-9218-d763bdadc6a0" containerName="container-00"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.312049 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.324616 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zbl25"]
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.426602 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-bvj6l_129d5672-c8dd-4a63-8d48-dc95c84a45b2/manager/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.436822 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-utilities\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.436941 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-catalog-content\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.436974 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v46bw\" (UniqueName: \"kubernetes.io/projected/b338aceb-403e-47d9-905d-0e152b132f22-kube-api-access-v46bw\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.533191 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-bwq8t_347e3ac8-4477-4bab-a64b-a443098bb400/kube-rbac-proxy/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.539247 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-utilities\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.539570 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-catalog-content\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.539715 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46bw\" (UniqueName: \"kubernetes.io/projected/b338aceb-403e-47d9-905d-0e152b132f22-kube-api-access-v46bw\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.540015 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-catalog-content\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.541049 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-utilities\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.562648 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v46bw\" (UniqueName: \"kubernetes.io/projected/b338aceb-403e-47d9-905d-0e152b132f22-kube-api-access-v46bw\") pod \"redhat-operators-zbl25\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") " pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.588512 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-bwq8t_347e3ac8-4477-4bab-a64b-a443098bb400/manager/0.log"
Sep 30 15:02:21 crc kubenswrapper[4676]: I0930 15:02:21.641010 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:22 crc kubenswrapper[4676]: I0930 15:02:22.147841 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zbl25"]
Sep 30 15:02:22 crc kubenswrapper[4676]: I0930 15:02:22.463214 4676 generic.go:334] "Generic (PLEG): container finished" podID="b338aceb-403e-47d9-905d-0e152b132f22" containerID="83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb" exitCode=0
Sep 30 15:02:22 crc kubenswrapper[4676]: I0930 15:02:22.463413 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbl25" event={"ID":"b338aceb-403e-47d9-905d-0e152b132f22","Type":"ContainerDied","Data":"83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb"}
Sep 30 15:02:22 crc kubenswrapper[4676]: I0930 15:02:22.463556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbl25" event={"ID":"b338aceb-403e-47d9-905d-0e152b132f22","Type":"ContainerStarted","Data":"0067b370be481492ff3a0e08205746c1997cb48307c4ffc074e70d8c3a5de139"}
Sep 30 15:02:24 crc kubenswrapper[4676]: I0930 15:02:24.492086 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbl25" event={"ID":"b338aceb-403e-47d9-905d-0e152b132f22","Type":"ContainerStarted","Data":"0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67"}
Sep 30 15:02:27 crc kubenswrapper[4676]: I0930 15:02:27.531677 4676 generic.go:334] "Generic (PLEG): container finished" podID="b338aceb-403e-47d9-905d-0e152b132f22" containerID="0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67" exitCode=0
Sep 30 15:02:27 crc kubenswrapper[4676]: I0930 15:02:27.531744 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbl25" event={"ID":"b338aceb-403e-47d9-905d-0e152b132f22","Type":"ContainerDied","Data":"0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67"}
Sep 30 15:02:28 crc kubenswrapper[4676]: I0930 15:02:28.547358 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbl25" event={"ID":"b338aceb-403e-47d9-905d-0e152b132f22","Type":"ContainerStarted","Data":"82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30"}
Sep 30 15:02:29 crc kubenswrapper[4676]: I0930 15:02:29.919984 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:02:29 crc kubenswrapper[4676]: I0930 15:02:29.920345 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:02:31 crc kubenswrapper[4676]: I0930 15:02:31.641949 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:31 crc kubenswrapper[4676]: I0930 15:02:31.642961 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:32 crc kubenswrapper[4676]: I0930 15:02:32.691667 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zbl25" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="registry-server" probeResult="failure" output=<
Sep 30 15:02:32 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s
Sep 30 15:02:32 crc kubenswrapper[4676]: >
Sep 30 15:02:37 crc kubenswrapper[4676]: I0930 15:02:37.114319 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wb8d2_97f344fc-42a3-4630-af31-ea25b72941e6/control-plane-machine-set-operator/0.log"
Sep 30 15:02:37 crc kubenswrapper[4676]: I0930 15:02:37.296261 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c9kjj_ebd6b987-d54a-4692-800a-8eadc5e8690c/kube-rbac-proxy/0.log"
Sep 30 15:02:37 crc kubenswrapper[4676]: I0930 15:02:37.310620 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c9kjj_ebd6b987-d54a-4692-800a-8eadc5e8690c/machine-api-operator/0.log"
Sep 30 15:02:41 crc kubenswrapper[4676]: I0930 15:02:41.687502 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:41 crc kubenswrapper[4676]: I0930 15:02:41.707285 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zbl25" podStartSLOduration=15.003355286 podStartE2EDuration="20.707255596s" podCreationTimestamp="2025-09-30 15:02:21 +0000 UTC" firstStartedPulling="2025-09-30 15:02:22.465745499 +0000 UTC m=+3846.448833928" lastFinishedPulling="2025-09-30 15:02:28.169645809 +0000 UTC m=+3852.152734238" observedRunningTime="2025-09-30 15:02:28.564674208 +0000 UTC m=+3852.547762637" watchObservedRunningTime="2025-09-30 15:02:41.707255596 +0000 UTC m=+3865.690344025"
Sep 30 15:02:41 crc kubenswrapper[4676]: I0930 15:02:41.745260 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:41 crc kubenswrapper[4676]: I0930 15:02:41.923440 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zbl25"]
Sep 30 15:02:43 crc kubenswrapper[4676]: I0930 15:02:43.680452 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zbl25" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="registry-server" containerID="cri-o://82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30" gracePeriod=2
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.157028 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.329256 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-utilities\") pod \"b338aceb-403e-47d9-905d-0e152b132f22\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") "
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.329960 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-catalog-content\") pod \"b338aceb-403e-47d9-905d-0e152b132f22\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") "
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.330135 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v46bw\" (UniqueName: \"kubernetes.io/projected/b338aceb-403e-47d9-905d-0e152b132f22-kube-api-access-v46bw\") pod \"b338aceb-403e-47d9-905d-0e152b132f22\" (UID: \"b338aceb-403e-47d9-905d-0e152b132f22\") "
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.330216 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-utilities" (OuterVolumeSpecName: "utilities") pod "b338aceb-403e-47d9-905d-0e152b132f22" (UID: "b338aceb-403e-47d9-905d-0e152b132f22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.330704 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.337657 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b338aceb-403e-47d9-905d-0e152b132f22-kube-api-access-v46bw" (OuterVolumeSpecName: "kube-api-access-v46bw") pod "b338aceb-403e-47d9-905d-0e152b132f22" (UID: "b338aceb-403e-47d9-905d-0e152b132f22"). InnerVolumeSpecName "kube-api-access-v46bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.426667 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b338aceb-403e-47d9-905d-0e152b132f22" (UID: "b338aceb-403e-47d9-905d-0e152b132f22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.433800 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b338aceb-403e-47d9-905d-0e152b132f22-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.433859 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v46bw\" (UniqueName: \"kubernetes.io/projected/b338aceb-403e-47d9-905d-0e152b132f22-kube-api-access-v46bw\") on node \"crc\" DevicePath \"\""
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.691251 4676 generic.go:334] "Generic (PLEG): container finished" podID="b338aceb-403e-47d9-905d-0e152b132f22" containerID="82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30" exitCode=0
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.691293 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbl25" event={"ID":"b338aceb-403e-47d9-905d-0e152b132f22","Type":"ContainerDied","Data":"82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30"}
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.691318 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbl25" event={"ID":"b338aceb-403e-47d9-905d-0e152b132f22","Type":"ContainerDied","Data":"0067b370be481492ff3a0e08205746c1997cb48307c4ffc074e70d8c3a5de139"}
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.691323 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbl25"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.691333 4676 scope.go:117] "RemoveContainer" containerID="82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.710707 4676 scope.go:117] "RemoveContainer" containerID="0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.735023 4676 scope.go:117] "RemoveContainer" containerID="83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.740054 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zbl25"]
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.749836 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zbl25"]
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.773951 4676 scope.go:117] "RemoveContainer" containerID="82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30"
Sep 30 15:02:44 crc kubenswrapper[4676]: E0930 15:02:44.774418 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30\": container with ID starting with 82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30 not found: ID does not exist" containerID="82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.774466 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30"} err="failed to get container status \"82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30\": rpc error: code = NotFound desc = could not find container \"82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30\": container with ID starting with 82ba06ba8f6b6c811a80ca7974a92e8044814b826533c39bbaee8fc3c79cbf30 not found: ID does not exist"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.774494 4676 scope.go:117] "RemoveContainer" containerID="0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67"
Sep 30 15:02:44 crc kubenswrapper[4676]: E0930 15:02:44.775192 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67\": container with ID starting with 0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67 not found: ID does not exist" containerID="0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.775228 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67"} err="failed to get container status \"0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67\": rpc error: code = NotFound desc = could not find container \"0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67\": container with ID starting with 0ea1004b6bb72c28efe16a4af0fd417c9583a356884d45f9edc96da90c31ca67 not found: ID does not exist"
Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.775255 4676 scope.go:117] "RemoveContainer" containerID="83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb"
Sep 30 15:02:44 crc kubenswrapper[4676]: E0930 15:02:44.775530 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb\": container with ID starting with 83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb not found: ID does not exist"
containerID="83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb" Sep 30 15:02:44 crc kubenswrapper[4676]: I0930 15:02:44.775557 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb"} err="failed to get container status \"83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb\": rpc error: code = NotFound desc = could not find container \"83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb\": container with ID starting with 83c485e99f31a037e608042d0f3ee843b2bdd7806a37beb31b37ed9ff60723cb not found: ID does not exist" Sep 30 15:02:45 crc kubenswrapper[4676]: I0930 15:02:45.444463 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b338aceb-403e-47d9-905d-0e152b132f22" path="/var/lib/kubelet/pods/b338aceb-403e-47d9-905d-0e152b132f22/volumes" Sep 30 15:02:48 crc kubenswrapper[4676]: I0930 15:02:48.436590 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7k5cw_74eb08ee-8ebc-4f31-a952-22b99cfb68ac/cert-manager-controller/0.log" Sep 30 15:02:48 crc kubenswrapper[4676]: I0930 15:02:48.552363 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qd84c_f593d783-2014-4870-b97a-66bd22eba1b4/cert-manager-cainjector/0.log" Sep 30 15:02:48 crc kubenswrapper[4676]: I0930 15:02:48.615257 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-srpgg_faf2d116-cfb0-451e-b919-5ff1e93ee944/cert-manager-webhook/0.log" Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.172013 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-l7jdz_87e24d2c-e308-4f03-a28c-eb3ca52bb5f6/nmstate-console-plugin/0.log" Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.362430 4676 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-w7g4n_ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc/nmstate-handler/0.log" Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.405891 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-w5b6p_6c1f56f8-8d1c-47f2-9099-25a15fdaee77/kube-rbac-proxy/0.log" Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.454434 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-w5b6p_6c1f56f8-8d1c-47f2-9099-25a15fdaee77/nmstate-metrics/0.log" Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.603714 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-4v459_73fea0d7-e7c7-4db0-8205-1c86203f6a88/nmstate-operator/0.log" Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.706245 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-rb9kz_9820f583-f5b4-4642-ade6-683242648b4d/nmstate-webhook/0.log" Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.920173 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:02:59 crc kubenswrapper[4676]: I0930 15:02:59.920253 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.427153 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5d688f5ffc-c24th_43a57b66-554a-40f3-ae9c-1f8dd4053405/kube-rbac-proxy/0.log" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.483838 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-c24th_43a57b66-554a-40f3-ae9c-1f8dd4053405/controller/0.log" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.628129 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-rdndd_518077e0-6a46-480c-9cdd-d5d5c64814b7/frr-k8s-webhook-server/0.log" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.718016 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.877178 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.891727 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.936067 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log" Sep 30 15:03:12 crc kubenswrapper[4676]: I0930 15:03:12.941135 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.119192 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.140133 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.181983 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.182034 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.366742 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.366827 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.371298 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/controller/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.379782 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.563575 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/frr-metrics/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.564837 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/kube-rbac-proxy-frr/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.575203 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/kube-rbac-proxy/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.747069 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/reloader/0.log" Sep 30 15:03:13 crc kubenswrapper[4676]: I0930 15:03:13.811828 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76c7cc4875-dchh6_2a1324e4-02a6-4c81-b533-086cbd21e10f/manager/0.log" Sep 30 15:03:14 crc kubenswrapper[4676]: I0930 15:03:14.059902 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-fb5687d59-w22bc_eb9ac7e3-b48e-44d4-9053-7e5ecec6a138/webhook-server/0.log" Sep 30 15:03:14 crc kubenswrapper[4676]: I0930 15:03:14.220971 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4gcd5_32604773-f635-41b2-a665-740ace937075/kube-rbac-proxy/0.log" Sep 30 15:03:14 crc kubenswrapper[4676]: I0930 15:03:14.858205 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4gcd5_32604773-f635-41b2-a665-740ace937075/speaker/0.log" Sep 30 15:03:15 crc kubenswrapper[4676]: I0930 15:03:15.178245 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/frr/0.log" Sep 30 15:03:25 crc kubenswrapper[4676]: I0930 15:03:25.629930 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/util/0.log" Sep 30 15:03:25 crc kubenswrapper[4676]: I0930 15:03:25.809916 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/util/0.log" Sep 30 15:03:25 crc 
kubenswrapper[4676]: I0930 15:03:25.810021 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/pull/0.log" Sep 30 15:03:25 crc kubenswrapper[4676]: I0930 15:03:25.817758 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/pull/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.014872 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/util/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.017727 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/pull/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.037348 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/extract/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.168005 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-utilities/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.347233 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-utilities/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.367549 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-content/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.390950 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-content/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.550918 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-content/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.560089 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-utilities/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.760953 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcljw_6f1bd977-799f-432c-80c7-cec958829e37/extract-utilities/0.log" Sep 30 15:03:26 crc kubenswrapper[4676]: I0930 15:03:26.965685 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcljw_6f1bd977-799f-432c-80c7-cec958829e37/extract-utilities/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.049709 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcljw_6f1bd977-799f-432c-80c7-cec958829e37/extract-content/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.056603 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcljw_6f1bd977-799f-432c-80c7-cec958829e37/extract-content/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.076219 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/registry-server/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.307466 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcljw_6f1bd977-799f-432c-80c7-cec958829e37/extract-utilities/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.329735 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcljw_6f1bd977-799f-432c-80c7-cec958829e37/extract-content/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.599733 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/util/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.802895 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/pull/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.873116 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/pull/0.log" Sep 30 15:03:27 crc kubenswrapper[4676]: I0930 15:03:27.875421 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/util/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.055899 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcljw_6f1bd977-799f-432c-80c7-cec958829e37/registry-server/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.107669 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/pull/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.115197 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/extract/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.181396 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/util/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.293147 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-vnxs8_fbc2525f-9c5f-4639-908d-35fed61607f5/marketplace-operator/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.350992 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-utilities/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.577246 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-content/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.589603 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-utilities/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.595696 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-content/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.727716 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-content/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.735621 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-utilities/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.929241 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/registry-server/0.log" Sep 30 15:03:28 crc kubenswrapper[4676]: I0930 15:03:28.929755 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-utilities/0.log" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.096705 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-utilities/0.log" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.132316 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-content/0.log" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.133246 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-content/0.log" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.295987 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-utilities/0.log" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.329610 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-content/0.log" Sep 
30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.781954 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/registry-server/0.log" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.919827 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.919902 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.919945 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.920646 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 15:03:29 crc kubenswrapper[4676]: I0930 15:03:29.920696 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" containerID="cri-o://01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" 
gracePeriod=600 Sep 30 15:03:30 crc kubenswrapper[4676]: E0930 15:03:30.043255 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:03:30 crc kubenswrapper[4676]: I0930 15:03:30.100127 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" exitCode=0 Sep 30 15:03:30 crc kubenswrapper[4676]: I0930 15:03:30.100156 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a"} Sep 30 15:03:30 crc kubenswrapper[4676]: I0930 15:03:30.100214 4676 scope.go:117] "RemoveContainer" containerID="30d1d9e76f6f2a6cf5fbf2ec7044efa19d21f8cf0a1419c92944482b8ca02b9b" Sep 30 15:03:30 crc kubenswrapper[4676]: I0930 15:03:30.100867 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:03:30 crc kubenswrapper[4676]: E0930 15:03:30.101278 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:03:44 crc kubenswrapper[4676]: I0930 
15:03:44.434659 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:03:44 crc kubenswrapper[4676]: E0930 15:03:44.438673 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:03:57 crc kubenswrapper[4676]: I0930 15:03:57.444287 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:03:57 crc kubenswrapper[4676]: E0930 15:03:57.445229 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:04:10 crc kubenswrapper[4676]: I0930 15:04:10.433093 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:04:10 crc kubenswrapper[4676]: E0930 15:04:10.434047 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:04:21 crc 
kubenswrapper[4676]: I0930 15:04:21.433199 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:04:21 crc kubenswrapper[4676]: E0930 15:04:21.434050 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:04:35 crc kubenswrapper[4676]: I0930 15:04:35.433192 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:04:35 crc kubenswrapper[4676]: E0930 15:04:35.434190 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:04:48 crc kubenswrapper[4676]: I0930 15:04:48.433318 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:04:48 crc kubenswrapper[4676]: E0930 15:04:48.434090 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 
30 15:04:53 crc kubenswrapper[4676]: I0930 15:04:53.920395 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wr58n"] Sep 30 15:04:53 crc kubenswrapper[4676]: E0930 15:04:53.921498 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="extract-utilities" Sep 30 15:04:53 crc kubenswrapper[4676]: I0930 15:04:53.921512 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="extract-utilities" Sep 30 15:04:53 crc kubenswrapper[4676]: E0930 15:04:53.921547 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="registry-server" Sep 30 15:04:53 crc kubenswrapper[4676]: I0930 15:04:53.921553 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="registry-server" Sep 30 15:04:53 crc kubenswrapper[4676]: E0930 15:04:53.921576 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="extract-content" Sep 30 15:04:53 crc kubenswrapper[4676]: I0930 15:04:53.921582 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="extract-content" Sep 30 15:04:53 crc kubenswrapper[4676]: I0930 15:04:53.921762 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b338aceb-403e-47d9-905d-0e152b132f22" containerName="registry-server" Sep 30 15:04:53 crc kubenswrapper[4676]: I0930 15:04:53.923516 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:53 crc kubenswrapper[4676]: I0930 15:04:53.933393 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wr58n"] Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.029585 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc5df74-e04c-45d8-92d4-a8e15bd54315-catalog-content\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.029680 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbckc\" (UniqueName: \"kubernetes.io/projected/3fc5df74-e04c-45d8-92d4-a8e15bd54315-kube-api-access-jbckc\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.029760 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc5df74-e04c-45d8-92d4-a8e15bd54315-utilities\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.131633 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc5df74-e04c-45d8-92d4-a8e15bd54315-catalog-content\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.131723 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jbckc\" (UniqueName: \"kubernetes.io/projected/3fc5df74-e04c-45d8-92d4-a8e15bd54315-kube-api-access-jbckc\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.131822 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc5df74-e04c-45d8-92d4-a8e15bd54315-utilities\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.132389 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc5df74-e04c-45d8-92d4-a8e15bd54315-catalog-content\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.132477 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc5df74-e04c-45d8-92d4-a8e15bd54315-utilities\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.150210 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbckc\" (UniqueName: \"kubernetes.io/projected/3fc5df74-e04c-45d8-92d4-a8e15bd54315-kube-api-access-jbckc\") pod \"community-operators-wr58n\" (UID: \"3fc5df74-e04c-45d8-92d4-a8e15bd54315\") " pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.247907 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.916589 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wr58n"] Sep 30 15:04:54 crc kubenswrapper[4676]: I0930 15:04:54.941020 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr58n" event={"ID":"3fc5df74-e04c-45d8-92d4-a8e15bd54315","Type":"ContainerStarted","Data":"098866f2a96a73523846c00288398b86fb71239de71a821d9945faeb47daac20"} Sep 30 15:04:55 crc kubenswrapper[4676]: I0930 15:04:55.950429 4676 generic.go:334] "Generic (PLEG): container finished" podID="3fc5df74-e04c-45d8-92d4-a8e15bd54315" containerID="9a525f80f72651b791f7d4289e3a18880aa23effe847cabeeaf37599ff3f1037" exitCode=0 Sep 30 15:04:55 crc kubenswrapper[4676]: I0930 15:04:55.950489 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr58n" event={"ID":"3fc5df74-e04c-45d8-92d4-a8e15bd54315","Type":"ContainerDied","Data":"9a525f80f72651b791f7d4289e3a18880aa23effe847cabeeaf37599ff3f1037"} Sep 30 15:04:55 crc kubenswrapper[4676]: I0930 15:04:55.953800 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 15:05:00 crc kubenswrapper[4676]: I0930 15:05:00.433471 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:05:00 crc kubenswrapper[4676]: E0930 15:05:00.434282 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 
15:05:02 crc kubenswrapper[4676]: I0930 15:05:02.011573 4676 generic.go:334] "Generic (PLEG): container finished" podID="3fc5df74-e04c-45d8-92d4-a8e15bd54315" containerID="61f4478a28b94d4f6c8500042143446babc76cfa7bcfa19346827f56c6a531f0" exitCode=0 Sep 30 15:05:02 crc kubenswrapper[4676]: I0930 15:05:02.011668 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr58n" event={"ID":"3fc5df74-e04c-45d8-92d4-a8e15bd54315","Type":"ContainerDied","Data":"61f4478a28b94d4f6c8500042143446babc76cfa7bcfa19346827f56c6a531f0"} Sep 30 15:05:04 crc kubenswrapper[4676]: I0930 15:05:04.031347 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr58n" event={"ID":"3fc5df74-e04c-45d8-92d4-a8e15bd54315","Type":"ContainerStarted","Data":"fd22e85efed1da7c055f3397d5f106e1e859d00bb4db1fcdfb9b5220d9f4d0d9"} Sep 30 15:05:04 crc kubenswrapper[4676]: I0930 15:05:04.056576 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wr58n" podStartSLOduration=4.283766771 podStartE2EDuration="11.056557737s" podCreationTimestamp="2025-09-30 15:04:53 +0000 UTC" firstStartedPulling="2025-09-30 15:04:55.953323187 +0000 UTC m=+3999.936411616" lastFinishedPulling="2025-09-30 15:05:02.726114153 +0000 UTC m=+4006.709202582" observedRunningTime="2025-09-30 15:05:04.050982279 +0000 UTC m=+4008.034070708" watchObservedRunningTime="2025-09-30 15:05:04.056557737 +0000 UTC m=+4008.039646166" Sep 30 15:05:04 crc kubenswrapper[4676]: I0930 15:05:04.249084 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:05:04 crc kubenswrapper[4676]: I0930 15:05:04.249463 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:05:05 crc kubenswrapper[4676]: I0930 15:05:05.302129 4676 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/community-operators-wr58n" podUID="3fc5df74-e04c-45d8-92d4-a8e15bd54315" containerName="registry-server" probeResult="failure" output=< Sep 30 15:05:05 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Sep 30 15:05:05 crc kubenswrapper[4676]: > Sep 30 15:05:12 crc kubenswrapper[4676]: I0930 15:05:12.433032 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:05:12 crc kubenswrapper[4676]: E0930 15:05:12.433825 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:05:14 crc kubenswrapper[4676]: I0930 15:05:14.294090 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:05:14 crc kubenswrapper[4676]: I0930 15:05:14.345694 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wr58n" Sep 30 15:05:14 crc kubenswrapper[4676]: I0930 15:05:14.423472 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wr58n"] Sep 30 15:05:14 crc kubenswrapper[4676]: I0930 15:05:14.577604 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qcljw"] Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.144251 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qcljw" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="registry-server" 
containerID="cri-o://1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" gracePeriod=2 Sep 30 15:05:15 crc kubenswrapper[4676]: E0930 15:05:15.498293 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012 is running failed: container process not found" containerID="1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 15:05:15 crc kubenswrapper[4676]: E0930 15:05:15.500148 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012 is running failed: container process not found" containerID="1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 15:05:15 crc kubenswrapper[4676]: E0930 15:05:15.500676 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012 is running failed: container process not found" containerID="1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 15:05:15 crc kubenswrapper[4676]: E0930 15:05:15.500717 4676 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-qcljw" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="registry-server" Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.661616 4676 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcljw" Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.845659 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-utilities\") pod \"6f1bd977-799f-432c-80c7-cec958829e37\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.845749 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-catalog-content\") pod \"6f1bd977-799f-432c-80c7-cec958829e37\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.845784 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96npb\" (UniqueName: \"kubernetes.io/projected/6f1bd977-799f-432c-80c7-cec958829e37-kube-api-access-96npb\") pod \"6f1bd977-799f-432c-80c7-cec958829e37\" (UID: \"6f1bd977-799f-432c-80c7-cec958829e37\") " Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.846768 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-utilities" (OuterVolumeSpecName: "utilities") pod "6f1bd977-799f-432c-80c7-cec958829e37" (UID: "6f1bd977-799f-432c-80c7-cec958829e37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.856256 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1bd977-799f-432c-80c7-cec958829e37-kube-api-access-96npb" (OuterVolumeSpecName: "kube-api-access-96npb") pod "6f1bd977-799f-432c-80c7-cec958829e37" (UID: "6f1bd977-799f-432c-80c7-cec958829e37"). 
InnerVolumeSpecName "kube-api-access-96npb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.920928 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f1bd977-799f-432c-80c7-cec958829e37" (UID: "6f1bd977-799f-432c-80c7-cec958829e37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.947973 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.948013 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1bd977-799f-432c-80c7-cec958829e37-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:05:15 crc kubenswrapper[4676]: I0930 15:05:15.948027 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96npb\" (UniqueName: \"kubernetes.io/projected/6f1bd977-799f-432c-80c7-cec958829e37-kube-api-access-96npb\") on node \"crc\" DevicePath \"\"" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.155890 4676 generic.go:334] "Generic (PLEG): container finished" podID="6f1bd977-799f-432c-80c7-cec958829e37" containerID="1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" exitCode=0 Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.155907 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcljw" event={"ID":"6f1bd977-799f-432c-80c7-cec958829e37","Type":"ContainerDied","Data":"1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012"} Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.156293 
4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcljw" event={"ID":"6f1bd977-799f-432c-80c7-cec958829e37","Type":"ContainerDied","Data":"37f6b1a68e08a82b49398e4a8ca45984e737fb12acf0b8149c73890e082b56c9"} Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.156321 4676 scope.go:117] "RemoveContainer" containerID="1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.155959 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcljw" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.186069 4676 scope.go:117] "RemoveContainer" containerID="1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.198043 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qcljw"] Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.205453 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qcljw"] Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.261824 4676 scope.go:117] "RemoveContainer" containerID="de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.311521 4676 scope.go:117] "RemoveContainer" containerID="1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" Sep 30 15:05:16 crc kubenswrapper[4676]: E0930 15:05:16.311867 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012\": container with ID starting with 1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012 not found: ID does not exist" containerID="1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012" Sep 30 15:05:16 
crc kubenswrapper[4676]: I0930 15:05:16.311911 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012"} err="failed to get container status \"1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012\": rpc error: code = NotFound desc = could not find container \"1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012\": container with ID starting with 1711df3767b1228cec6ea1c39049dc07a8ad949da5745a8d23141718b5efe012 not found: ID does not exist" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.311939 4676 scope.go:117] "RemoveContainer" containerID="1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73" Sep 30 15:05:16 crc kubenswrapper[4676]: E0930 15:05:16.312339 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73\": container with ID starting with 1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73 not found: ID does not exist" containerID="1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.312362 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73"} err="failed to get container status \"1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73\": rpc error: code = NotFound desc = could not find container \"1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73\": container with ID starting with 1d7e1d673056b3d5155f6ceea152eea9fa776e14fe7796044b64eff4fa6fcb73 not found: ID does not exist" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.312378 4676 scope.go:117] "RemoveContainer" containerID="de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5" Sep 30 
15:05:16 crc kubenswrapper[4676]: E0930 15:05:16.312620 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5\": container with ID starting with de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5 not found: ID does not exist" containerID="de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5" Sep 30 15:05:16 crc kubenswrapper[4676]: I0930 15:05:16.312660 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5"} err="failed to get container status \"de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5\": rpc error: code = NotFound desc = could not find container \"de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5\": container with ID starting with de995028abcf78fb09d871fa5c53c9c43166a6aaef10898177900d17fa71fff5 not found: ID does not exist" Sep 30 15:05:17 crc kubenswrapper[4676]: I0930 15:05:17.450945 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1bd977-799f-432c-80c7-cec958829e37" path="/var/lib/kubelet/pods/6f1bd977-799f-432c-80c7-cec958829e37/volumes" Sep 30 15:05:24 crc kubenswrapper[4676]: I0930 15:05:24.434527 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:05:24 crc kubenswrapper[4676]: E0930 15:05:24.435400 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:05:37 crc 
kubenswrapper[4676]: I0930 15:05:37.349604 4676 generic.go:334] "Generic (PLEG): container finished" podID="26bc2079-4750-497e-a94b-c77e49611498" containerID="cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7" exitCode=0 Sep 30 15:05:37 crc kubenswrapper[4676]: I0930 15:05:37.350148 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q7tf5/must-gather-72l9f" event={"ID":"26bc2079-4750-497e-a94b-c77e49611498","Type":"ContainerDied","Data":"cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7"} Sep 30 15:05:37 crc kubenswrapper[4676]: I0930 15:05:37.350844 4676 scope.go:117] "RemoveContainer" containerID="cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7" Sep 30 15:05:37 crc kubenswrapper[4676]: I0930 15:05:37.971278 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q7tf5_must-gather-72l9f_26bc2079-4750-497e-a94b-c77e49611498/gather/0.log" Sep 30 15:05:38 crc kubenswrapper[4676]: I0930 15:05:38.433647 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:05:38 crc kubenswrapper[4676]: E0930 15:05:38.434172 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:05:45 crc kubenswrapper[4676]: I0930 15:05:45.506144 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q7tf5/must-gather-72l9f"] Sep 30 15:05:45 crc kubenswrapper[4676]: I0930 15:05:45.506999 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-q7tf5/must-gather-72l9f" 
podUID="26bc2079-4750-497e-a94b-c77e49611498" containerName="copy" containerID="cri-o://d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38" gracePeriod=2 Sep 30 15:05:45 crc kubenswrapper[4676]: I0930 15:05:45.515579 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q7tf5/must-gather-72l9f"] Sep 30 15:05:45 crc kubenswrapper[4676]: I0930 15:05:45.957457 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q7tf5_must-gather-72l9f_26bc2079-4750-497e-a94b-c77e49611498/copy/0.log" Sep 30 15:05:45 crc kubenswrapper[4676]: I0930 15:05:45.958378 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.141475 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26bc2079-4750-497e-a94b-c77e49611498-must-gather-output\") pod \"26bc2079-4750-497e-a94b-c77e49611498\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.141771 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jvkb\" (UniqueName: \"kubernetes.io/projected/26bc2079-4750-497e-a94b-c77e49611498-kube-api-access-7jvkb\") pod \"26bc2079-4750-497e-a94b-c77e49611498\" (UID: \"26bc2079-4750-497e-a94b-c77e49611498\") " Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.150096 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26bc2079-4750-497e-a94b-c77e49611498-kube-api-access-7jvkb" (OuterVolumeSpecName: "kube-api-access-7jvkb") pod "26bc2079-4750-497e-a94b-c77e49611498" (UID: "26bc2079-4750-497e-a94b-c77e49611498"). InnerVolumeSpecName "kube-api-access-7jvkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.248547 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jvkb\" (UniqueName: \"kubernetes.io/projected/26bc2079-4750-497e-a94b-c77e49611498-kube-api-access-7jvkb\") on node \"crc\" DevicePath \"\"" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.303172 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26bc2079-4750-497e-a94b-c77e49611498-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "26bc2079-4750-497e-a94b-c77e49611498" (UID: "26bc2079-4750-497e-a94b-c77e49611498"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.350044 4676 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/26bc2079-4750-497e-a94b-c77e49611498-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.425660 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q7tf5_must-gather-72l9f_26bc2079-4750-497e-a94b-c77e49611498/copy/0.log" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.426178 4676 generic.go:334] "Generic (PLEG): container finished" podID="26bc2079-4750-497e-a94b-c77e49611498" containerID="d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38" exitCode=143 Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.426234 4676 scope.go:117] "RemoveContainer" containerID="d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.426240 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q7tf5/must-gather-72l9f" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.461319 4676 scope.go:117] "RemoveContainer" containerID="cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.539210 4676 scope.go:117] "RemoveContainer" containerID="d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38" Sep 30 15:05:46 crc kubenswrapper[4676]: E0930 15:05:46.540074 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38\": container with ID starting with d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38 not found: ID does not exist" containerID="d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.540115 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38"} err="failed to get container status \"d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38\": rpc error: code = NotFound desc = could not find container \"d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38\": container with ID starting with d99ee1a171de9a4f78efb869f912ec2edb4d66ebf5ff5d68a9dcf20c6b80db38 not found: ID does not exist" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.540141 4676 scope.go:117] "RemoveContainer" containerID="cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7" Sep 30 15:05:46 crc kubenswrapper[4676]: E0930 15:05:46.540628 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7\": container with ID starting with 
cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7 not found: ID does not exist" containerID="cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7" Sep 30 15:05:46 crc kubenswrapper[4676]: I0930 15:05:46.540658 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7"} err="failed to get container status \"cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7\": rpc error: code = NotFound desc = could not find container \"cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7\": container with ID starting with cc147e3b329127dc2ee2efe3997c04c72a65883c2fdf8045dc77df627894a8b7 not found: ID does not exist" Sep 30 15:05:47 crc kubenswrapper[4676]: I0930 15:05:47.443669 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26bc2079-4750-497e-a94b-c77e49611498" path="/var/lib/kubelet/pods/26bc2079-4750-497e-a94b-c77e49611498/volumes" Sep 30 15:05:51 crc kubenswrapper[4676]: I0930 15:05:51.432924 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:05:51 crc kubenswrapper[4676]: E0930 15:05:51.433631 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:06:04 crc kubenswrapper[4676]: I0930 15:06:04.433488 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:06:04 crc kubenswrapper[4676]: E0930 15:06:04.434371 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:06:17 crc kubenswrapper[4676]: I0930 15:06:17.434082 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:06:17 crc kubenswrapper[4676]: E0930 15:06:17.435522 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.765025 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sr586"] Sep 30 15:06:21 crc kubenswrapper[4676]: E0930 15:06:21.766010 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="extract-content" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766027 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="extract-content" Sep 30 15:06:21 crc kubenswrapper[4676]: E0930 15:06:21.766042 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="extract-utilities" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766049 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1bd977-799f-432c-80c7-cec958829e37" 
containerName="extract-utilities" Sep 30 15:06:21 crc kubenswrapper[4676]: E0930 15:06:21.766061 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26bc2079-4750-497e-a94b-c77e49611498" containerName="copy" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766067 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="26bc2079-4750-497e-a94b-c77e49611498" containerName="copy" Sep 30 15:06:21 crc kubenswrapper[4676]: E0930 15:06:21.766100 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26bc2079-4750-497e-a94b-c77e49611498" containerName="gather" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766107 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="26bc2079-4750-497e-a94b-c77e49611498" containerName="gather" Sep 30 15:06:21 crc kubenswrapper[4676]: E0930 15:06:21.766117 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="registry-server" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766123 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="registry-server" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766287 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1bd977-799f-432c-80c7-cec958829e37" containerName="registry-server" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766303 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="26bc2079-4750-497e-a94b-c77e49611498" containerName="copy" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.766319 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="26bc2079-4750-497e-a94b-c77e49611498" containerName="gather" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.767688 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.783847 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sr586"] Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.836205 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-utilities\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.836268 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-catalog-content\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.836325 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx6sg\" (UniqueName: \"kubernetes.io/projected/06dc138c-5f30-4827-ba1e-6a7c83ffda97-kube-api-access-qx6sg\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.938576 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-catalog-content\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.939002 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qx6sg\" (UniqueName: \"kubernetes.io/projected/06dc138c-5f30-4827-ba1e-6a7c83ffda97-kube-api-access-qx6sg\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.939259 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-utilities\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.945568 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-catalog-content\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.945665 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-utilities\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:21 crc kubenswrapper[4676]: I0930 15:06:21.981028 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx6sg\" (UniqueName: \"kubernetes.io/projected/06dc138c-5f30-4827-ba1e-6a7c83ffda97-kube-api-access-qx6sg\") pod \"certified-operators-sr586\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:22 crc kubenswrapper[4676]: I0930 15:06:22.089760 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:22 crc kubenswrapper[4676]: I0930 15:06:22.665303 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sr586"] Sep 30 15:06:22 crc kubenswrapper[4676]: I0930 15:06:22.749694 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr586" event={"ID":"06dc138c-5f30-4827-ba1e-6a7c83ffda97","Type":"ContainerStarted","Data":"71069e3fad428dcc70ddccf087bf88a394636306686c54f9ccbc4822f64cff0e"} Sep 30 15:06:23 crc kubenswrapper[4676]: I0930 15:06:23.386157 4676 scope.go:117] "RemoveContainer" containerID="9bcdc1a8d44a9a07a94c6f07e7996f5427a28a3704e17cea61c4ce265a85651d" Sep 30 15:06:23 crc kubenswrapper[4676]: I0930 15:06:23.758736 4676 generic.go:334] "Generic (PLEG): container finished" podID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerID="4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963" exitCode=0 Sep 30 15:06:23 crc kubenswrapper[4676]: I0930 15:06:23.758788 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr586" event={"ID":"06dc138c-5f30-4827-ba1e-6a7c83ffda97","Type":"ContainerDied","Data":"4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963"} Sep 30 15:06:24 crc kubenswrapper[4676]: I0930 15:06:24.966506 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v8fdp"] Sep 30 15:06:24 crc kubenswrapper[4676]: I0930 15:06:24.969442 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:24 crc kubenswrapper[4676]: I0930 15:06:24.977404 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8fdp"] Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.002185 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bpkh\" (UniqueName: \"kubernetes.io/projected/d29f2572-6f3b-4af1-8628-915d588f9ec3-kube-api-access-9bpkh\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.002236 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-utilities\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.002274 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-catalog-content\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.104629 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bpkh\" (UniqueName: \"kubernetes.io/projected/d29f2572-6f3b-4af1-8628-915d588f9ec3-kube-api-access-9bpkh\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.104702 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-utilities\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.104749 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-catalog-content\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.105261 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-utilities\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.105293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-catalog-content\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.132117 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bpkh\" (UniqueName: \"kubernetes.io/projected/d29f2572-6f3b-4af1-8628-915d588f9ec3-kube-api-access-9bpkh\") pod \"redhat-marketplace-v8fdp\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.302600 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.754362 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8fdp"] Sep 30 15:06:25 crc kubenswrapper[4676]: I0930 15:06:25.778948 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr586" event={"ID":"06dc138c-5f30-4827-ba1e-6a7c83ffda97","Type":"ContainerStarted","Data":"94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0"} Sep 30 15:06:26 crc kubenswrapper[4676]: I0930 15:06:26.795333 4676 generic.go:334] "Generic (PLEG): container finished" podID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerID="94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0" exitCode=0 Sep 30 15:06:26 crc kubenswrapper[4676]: I0930 15:06:26.795586 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr586" event={"ID":"06dc138c-5f30-4827-ba1e-6a7c83ffda97","Type":"ContainerDied","Data":"94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0"} Sep 30 15:06:26 crc kubenswrapper[4676]: I0930 15:06:26.800300 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8fdp" event={"ID":"d29f2572-6f3b-4af1-8628-915d588f9ec3","Type":"ContainerStarted","Data":"8a6004fef06a658de189d7831384b7c53fb64ec7454421c237e33faddbb59f6a"} Sep 30 15:06:27 crc kubenswrapper[4676]: I0930 15:06:27.811946 4676 generic.go:334] "Generic (PLEG): container finished" podID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerID="073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864" exitCode=0 Sep 30 15:06:27 crc kubenswrapper[4676]: I0930 15:06:27.812305 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8fdp" 
event={"ID":"d29f2572-6f3b-4af1-8628-915d588f9ec3","Type":"ContainerDied","Data":"073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864"} Sep 30 15:06:28 crc kubenswrapper[4676]: I0930 15:06:28.828161 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr586" event={"ID":"06dc138c-5f30-4827-ba1e-6a7c83ffda97","Type":"ContainerStarted","Data":"69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e"} Sep 30 15:06:29 crc kubenswrapper[4676]: I0930 15:06:29.433573 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:06:29 crc kubenswrapper[4676]: E0930 15:06:29.434144 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:06:29 crc kubenswrapper[4676]: I0930 15:06:29.838906 4676 generic.go:334] "Generic (PLEG): container finished" podID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerID="571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a" exitCode=0 Sep 30 15:06:29 crc kubenswrapper[4676]: I0930 15:06:29.839007 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8fdp" event={"ID":"d29f2572-6f3b-4af1-8628-915d588f9ec3","Type":"ContainerDied","Data":"571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a"} Sep 30 15:06:29 crc kubenswrapper[4676]: I0930 15:06:29.860521 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sr586" podStartSLOduration=5.167689882 podStartE2EDuration="8.860504559s" 
podCreationTimestamp="2025-09-30 15:06:21 +0000 UTC" firstStartedPulling="2025-09-30 15:06:24.115418678 +0000 UTC m=+4088.098507107" lastFinishedPulling="2025-09-30 15:06:27.808233355 +0000 UTC m=+4091.791321784" observedRunningTime="2025-09-30 15:06:28.86503221 +0000 UTC m=+4092.848120639" watchObservedRunningTime="2025-09-30 15:06:29.860504559 +0000 UTC m=+4093.843592978" Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.851736 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8fdp" event={"ID":"d29f2572-6f3b-4af1-8628-915d588f9ec3","Type":"ContainerStarted","Data":"3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8"} Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.856383 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n4b4l/must-gather-4bsdb"] Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.858385 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.872325 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n4b4l"/"kube-root-ca.crt" Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.874074 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n4b4l"/"openshift-service-ca.crt" Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.886062 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n4b4l/must-gather-4bsdb"] Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.941195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wk8l\" (UniqueName: \"kubernetes.io/projected/181361c5-87f6-48f3-af90-6fa805a3ce1e-kube-api-access-7wk8l\") pod \"must-gather-4bsdb\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " 
pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.941697 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/181361c5-87f6-48f3-af90-6fa805a3ce1e-must-gather-output\") pod \"must-gather-4bsdb\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:30 crc kubenswrapper[4676]: I0930 15:06:30.941748 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v8fdp" podStartSLOduration=4.303585037 podStartE2EDuration="6.941734165s" podCreationTimestamp="2025-09-30 15:06:24 +0000 UTC" firstStartedPulling="2025-09-30 15:06:27.828059214 +0000 UTC m=+4091.811147643" lastFinishedPulling="2025-09-30 15:06:30.466208342 +0000 UTC m=+4094.449296771" observedRunningTime="2025-09-30 15:06:30.928930154 +0000 UTC m=+4094.912018593" watchObservedRunningTime="2025-09-30 15:06:30.941734165 +0000 UTC m=+4094.924822594" Sep 30 15:06:31 crc kubenswrapper[4676]: I0930 15:06:31.043387 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wk8l\" (UniqueName: \"kubernetes.io/projected/181361c5-87f6-48f3-af90-6fa805a3ce1e-kube-api-access-7wk8l\") pod \"must-gather-4bsdb\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:31 crc kubenswrapper[4676]: I0930 15:06:31.043604 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/181361c5-87f6-48f3-af90-6fa805a3ce1e-must-gather-output\") pod \"must-gather-4bsdb\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:31 crc kubenswrapper[4676]: I0930 15:06:31.044117 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/181361c5-87f6-48f3-af90-6fa805a3ce1e-must-gather-output\") pod \"must-gather-4bsdb\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:31 crc kubenswrapper[4676]: I0930 15:06:31.066377 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wk8l\" (UniqueName: \"kubernetes.io/projected/181361c5-87f6-48f3-af90-6fa805a3ce1e-kube-api-access-7wk8l\") pod \"must-gather-4bsdb\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:31 crc kubenswrapper[4676]: I0930 15:06:31.188679 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:06:31 crc kubenswrapper[4676]: I0930 15:06:31.774052 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n4b4l/must-gather-4bsdb"] Sep 30 15:06:31 crc kubenswrapper[4676]: I0930 15:06:31.868421 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" event={"ID":"181361c5-87f6-48f3-af90-6fa805a3ce1e","Type":"ContainerStarted","Data":"d19967828f19efaf8e5f7a6d27321cf8569c61cade89ecdc664a0aab1a1e781b"} Sep 30 15:06:32 crc kubenswrapper[4676]: I0930 15:06:32.091101 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:32 crc kubenswrapper[4676]: I0930 15:06:32.091218 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:32 crc kubenswrapper[4676]: I0930 15:06:32.146406 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:32 crc kubenswrapper[4676]: I0930 15:06:32.880951 
4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" event={"ID":"181361c5-87f6-48f3-af90-6fa805a3ce1e","Type":"ContainerStarted","Data":"0ebb3bb354cc00eaf9bfa136ef5ec932074beeb83f53c21de7030ae7ad0f2764"} Sep 30 15:06:32 crc kubenswrapper[4676]: I0930 15:06:32.881339 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" event={"ID":"181361c5-87f6-48f3-af90-6fa805a3ce1e","Type":"ContainerStarted","Data":"d206abf5e5622ace975e1f2162ef7a6afdc4030e43b1084febbd57c3473b9330"} Sep 30 15:06:32 crc kubenswrapper[4676]: I0930 15:06:32.903613 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" podStartSLOduration=2.903595158 podStartE2EDuration="2.903595158s" podCreationTimestamp="2025-09-30 15:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:06:32.898277856 +0000 UTC m=+4096.881366285" watchObservedRunningTime="2025-09-30 15:06:32.903595158 +0000 UTC m=+4096.886683587" Sep 30 15:06:35 crc kubenswrapper[4676]: I0930 15:06:35.303101 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:35 crc kubenswrapper[4676]: I0930 15:06:35.306183 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:35 crc kubenswrapper[4676]: I0930 15:06:35.376617 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:35 crc kubenswrapper[4676]: I0930 15:06:35.958435 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:36 crc kubenswrapper[4676]: I0930 15:06:36.020251 4676 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8fdp"] Sep 30 15:06:36 crc kubenswrapper[4676]: I0930 15:06:36.953214 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-rzz85"] Sep 30 15:06:36 crc kubenswrapper[4676]: I0930 15:06:36.955064 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:36 crc kubenswrapper[4676]: I0930 15:06:36.957541 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n4b4l"/"default-dockercfg-6k2x6" Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.066414 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9trn\" (UniqueName: \"kubernetes.io/projected/0d7a5463-fda2-48c6-b9b9-72a9f724298a-kube-api-access-r9trn\") pod \"crc-debug-rzz85\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.066727 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7a5463-fda2-48c6-b9b9-72a9f724298a-host\") pod \"crc-debug-rzz85\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.168318 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9trn\" (UniqueName: \"kubernetes.io/projected/0d7a5463-fda2-48c6-b9b9-72a9f724298a-kube-api-access-r9trn\") pod \"crc-debug-rzz85\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.168461 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/0d7a5463-fda2-48c6-b9b9-72a9f724298a-host\") pod \"crc-debug-rzz85\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.168656 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7a5463-fda2-48c6-b9b9-72a9f724298a-host\") pod \"crc-debug-rzz85\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.192977 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9trn\" (UniqueName: \"kubernetes.io/projected/0d7a5463-fda2-48c6-b9b9-72a9f724298a-kube-api-access-r9trn\") pod \"crc-debug-rzz85\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.282650 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:06:37 crc kubenswrapper[4676]: W0930 15:06:37.329233 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d7a5463_fda2_48c6_b9b9_72a9f724298a.slice/crio-47224d17136952b1bcd98b8c59d33f09a452c9e0c4df914dd044ce3d017373cd WatchSource:0}: Error finding container 47224d17136952b1bcd98b8c59d33f09a452c9e0c4df914dd044ce3d017373cd: Status 404 returned error can't find the container with id 47224d17136952b1bcd98b8c59d33f09a452c9e0c4df914dd044ce3d017373cd Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.941806 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v8fdp" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="registry-server" containerID="cri-o://3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8" gracePeriod=2 Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.942454 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" event={"ID":"0d7a5463-fda2-48c6-b9b9-72a9f724298a","Type":"ContainerStarted","Data":"b132d4d4418c8be10fd84cd8a531657cbf9cd19b789c6b8577188db054e51fa7"} Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.942541 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" event={"ID":"0d7a5463-fda2-48c6-b9b9-72a9f724298a","Type":"ContainerStarted","Data":"47224d17136952b1bcd98b8c59d33f09a452c9e0c4df914dd044ce3d017373cd"} Sep 30 15:06:37 crc kubenswrapper[4676]: I0930 15:06:37.981293 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" podStartSLOduration=1.98127162 podStartE2EDuration="1.98127162s" podCreationTimestamp="2025-09-30 15:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:06:37.965438417 +0000 UTC m=+4101.948526846" watchObservedRunningTime="2025-09-30 15:06:37.98127162 +0000 UTC m=+4101.964360049" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.526133 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.593537 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bpkh\" (UniqueName: \"kubernetes.io/projected/d29f2572-6f3b-4af1-8628-915d588f9ec3-kube-api-access-9bpkh\") pod \"d29f2572-6f3b-4af1-8628-915d588f9ec3\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.593855 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-utilities\") pod \"d29f2572-6f3b-4af1-8628-915d588f9ec3\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.593936 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-catalog-content\") pod \"d29f2572-6f3b-4af1-8628-915d588f9ec3\" (UID: \"d29f2572-6f3b-4af1-8628-915d588f9ec3\") " Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.596733 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-utilities" (OuterVolumeSpecName: "utilities") pod "d29f2572-6f3b-4af1-8628-915d588f9ec3" (UID: "d29f2572-6f3b-4af1-8628-915d588f9ec3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.606622 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29f2572-6f3b-4af1-8628-915d588f9ec3-kube-api-access-9bpkh" (OuterVolumeSpecName: "kube-api-access-9bpkh") pod "d29f2572-6f3b-4af1-8628-915d588f9ec3" (UID: "d29f2572-6f3b-4af1-8628-915d588f9ec3"). InnerVolumeSpecName "kube-api-access-9bpkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.611490 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d29f2572-6f3b-4af1-8628-915d588f9ec3" (UID: "d29f2572-6f3b-4af1-8628-915d588f9ec3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.696348 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bpkh\" (UniqueName: \"kubernetes.io/projected/d29f2572-6f3b-4af1-8628-915d588f9ec3-kube-api-access-9bpkh\") on node \"crc\" DevicePath \"\"" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.696390 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.696403 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d29f2572-6f3b-4af1-8628-915d588f9ec3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.962334 4676 generic.go:334] "Generic (PLEG): container finished" podID="d29f2572-6f3b-4af1-8628-915d588f9ec3" 
containerID="3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8" exitCode=0 Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.962394 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8fdp" event={"ID":"d29f2572-6f3b-4af1-8628-915d588f9ec3","Type":"ContainerDied","Data":"3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8"} Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.962429 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8fdp" event={"ID":"d29f2572-6f3b-4af1-8628-915d588f9ec3","Type":"ContainerDied","Data":"8a6004fef06a658de189d7831384b7c53fb64ec7454421c237e33faddbb59f6a"} Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.962453 4676 scope.go:117] "RemoveContainer" containerID="3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.962621 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8fdp" Sep 30 15:06:38 crc kubenswrapper[4676]: I0930 15:06:38.988231 4676 scope.go:117] "RemoveContainer" containerID="571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.008178 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8fdp"] Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.019170 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8fdp"] Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.040976 4676 scope.go:117] "RemoveContainer" containerID="073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.077444 4676 scope.go:117] "RemoveContainer" containerID="3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8" Sep 30 15:06:39 crc kubenswrapper[4676]: E0930 15:06:39.078011 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8\": container with ID starting with 3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8 not found: ID does not exist" containerID="3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.078175 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8"} err="failed to get container status \"3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8\": rpc error: code = NotFound desc = could not find container \"3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8\": container with ID starting with 3b8fcc42aed90752d41026b7a1f726584f6a1a72bd4e8ebdfa88162f6086d2e8 not found: 
ID does not exist" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.078295 4676 scope.go:117] "RemoveContainer" containerID="571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a" Sep 30 15:06:39 crc kubenswrapper[4676]: E0930 15:06:39.078761 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a\": container with ID starting with 571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a not found: ID does not exist" containerID="571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.078865 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a"} err="failed to get container status \"571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a\": rpc error: code = NotFound desc = could not find container \"571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a\": container with ID starting with 571419cf4f21852ac386c09c8486cb81566f91b5490a3ee28105c6f373af3d0a not found: ID does not exist" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.078979 4676 scope.go:117] "RemoveContainer" containerID="073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864" Sep 30 15:06:39 crc kubenswrapper[4676]: E0930 15:06:39.079999 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864\": container with ID starting with 073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864 not found: ID does not exist" containerID="073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.080105 4676 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864"} err="failed to get container status \"073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864\": rpc error: code = NotFound desc = could not find container \"073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864\": container with ID starting with 073e530e3558badb88bc3163fdf74a252bc0796dcb48a7cbb2bc283c3a63c864 not found: ID does not exist" Sep 30 15:06:39 crc kubenswrapper[4676]: I0930 15:06:39.444054 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" path="/var/lib/kubelet/pods/d29f2572-6f3b-4af1-8628-915d588f9ec3/volumes" Sep 30 15:06:42 crc kubenswrapper[4676]: I0930 15:06:42.140973 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:42 crc kubenswrapper[4676]: I0930 15:06:42.192920 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sr586"] Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.000391 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sr586" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="registry-server" containerID="cri-o://69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e" gracePeriod=2 Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.438506 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:06:43 crc kubenswrapper[4676]: E0930 15:06:43.439173 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.604575 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.705317 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx6sg\" (UniqueName: \"kubernetes.io/projected/06dc138c-5f30-4827-ba1e-6a7c83ffda97-kube-api-access-qx6sg\") pod \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.705373 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-catalog-content\") pod \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.705497 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-utilities\") pod \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\" (UID: \"06dc138c-5f30-4827-ba1e-6a7c83ffda97\") " Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.706686 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-utilities" (OuterVolumeSpecName: "utilities") pod "06dc138c-5f30-4827-ba1e-6a7c83ffda97" (UID: "06dc138c-5f30-4827-ba1e-6a7c83ffda97"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.712357 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06dc138c-5f30-4827-ba1e-6a7c83ffda97-kube-api-access-qx6sg" (OuterVolumeSpecName: "kube-api-access-qx6sg") pod "06dc138c-5f30-4827-ba1e-6a7c83ffda97" (UID: "06dc138c-5f30-4827-ba1e-6a7c83ffda97"). InnerVolumeSpecName "kube-api-access-qx6sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.758263 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06dc138c-5f30-4827-ba1e-6a7c83ffda97" (UID: "06dc138c-5f30-4827-ba1e-6a7c83ffda97"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.807608 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx6sg\" (UniqueName: \"kubernetes.io/projected/06dc138c-5f30-4827-ba1e-6a7c83ffda97-kube-api-access-qx6sg\") on node \"crc\" DevicePath \"\"" Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.807649 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:06:43 crc kubenswrapper[4676]: I0930 15:06:43.807667 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06dc138c-5f30-4827-ba1e-6a7c83ffda97-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.014607 4676 generic.go:334] "Generic (PLEG): container finished" podID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" 
containerID="69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e" exitCode=0 Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.014678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr586" event={"ID":"06dc138c-5f30-4827-ba1e-6a7c83ffda97","Type":"ContainerDied","Data":"69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e"} Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.014959 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr586" event={"ID":"06dc138c-5f30-4827-ba1e-6a7c83ffda97","Type":"ContainerDied","Data":"71069e3fad428dcc70ddccf087bf88a394636306686c54f9ccbc4822f64cff0e"} Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.014737 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sr586" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.015074 4676 scope.go:117] "RemoveContainer" containerID="69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.050281 4676 scope.go:117] "RemoveContainer" containerID="94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.068947 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sr586"] Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.079962 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sr586"] Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.081850 4676 scope.go:117] "RemoveContainer" containerID="4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.137138 4676 scope.go:117] "RemoveContainer" containerID="69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e" Sep 30 
15:06:44 crc kubenswrapper[4676]: E0930 15:06:44.138004 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e\": container with ID starting with 69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e not found: ID does not exist" containerID="69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.138064 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e"} err="failed to get container status \"69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e\": rpc error: code = NotFound desc = could not find container \"69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e\": container with ID starting with 69d591432c34cd5d862ae97144affeae9ff4c46df08497f94150eab78a71099e not found: ID does not exist" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.138092 4676 scope.go:117] "RemoveContainer" containerID="94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0" Sep 30 15:06:44 crc kubenswrapper[4676]: E0930 15:06:44.138507 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0\": container with ID starting with 94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0 not found: ID does not exist" containerID="94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.138550 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0"} err="failed to get container status 
\"94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0\": rpc error: code = NotFound desc = could not find container \"94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0\": container with ID starting with 94e8fcf6fcbd2b07d01ff407f27d76617b578a07c82f9ec110cdb2c85b9522b0 not found: ID does not exist" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.138587 4676 scope.go:117] "RemoveContainer" containerID="4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963" Sep 30 15:06:44 crc kubenswrapper[4676]: E0930 15:06:44.139117 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963\": container with ID starting with 4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963 not found: ID does not exist" containerID="4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963" Sep 30 15:06:44 crc kubenswrapper[4676]: I0930 15:06:44.139154 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963"} err="failed to get container status \"4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963\": rpc error: code = NotFound desc = could not find container \"4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963\": container with ID starting with 4b3115df2e8143807fd264f72ddfb737cff9b853d5505ead77d4c2f5e9f33963 not found: ID does not exist" Sep 30 15:06:45 crc kubenswrapper[4676]: I0930 15:06:45.445200 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" path="/var/lib/kubelet/pods/06dc138c-5f30-4827-ba1e-6a7c83ffda97/volumes" Sep 30 15:06:56 crc kubenswrapper[4676]: I0930 15:06:56.432918 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 
15:06:56 crc kubenswrapper[4676]: E0930 15:06:56.434700 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:07:09 crc kubenswrapper[4676]: I0930 15:07:09.436103 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:07:09 crc kubenswrapper[4676]: E0930 15:07:09.436891 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:07:21 crc kubenswrapper[4676]: I0930 15:07:21.435106 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:07:21 crc kubenswrapper[4676]: E0930 15:07:21.436835 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:07:33 crc kubenswrapper[4676]: I0930 15:07:33.433605 4676 scope.go:117] "RemoveContainer" 
containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:07:33 crc kubenswrapper[4676]: E0930 15:07:33.434546 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:07:39 crc kubenswrapper[4676]: I0930 15:07:39.942805 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbdbfbbdb-mjhjg_1e134bd5-ad40-427d-ba65-7cf9a5a25104/barbican-api/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.035710 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5fbdbfbbdb-mjhjg_1e134bd5-ad40-427d-ba65-7cf9a5a25104/barbican-api-log/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.133231 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-777c6c994b-kk5rn_70c7535f-4b3b-438f-9470-c857ece73452/barbican-keystone-listener/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.242556 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-777c6c994b-kk5rn_70c7535f-4b3b-438f-9470-c857ece73452/barbican-keystone-listener-log/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.340115 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-bf6466755-m2t9t_39f521d2-b195-4179-a114-1c1611e4ba2f/barbican-worker/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.455946 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-bf6466755-m2t9t_39f521d2-b195-4179-a114-1c1611e4ba2f/barbican-worker-log/0.log" Sep 
30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.565308 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9p8tn_9093887d-1e08-4208-9584-a78c329fd7b0/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.781188 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/ceilometer-central-agent/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.809718 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/ceilometer-notification-agent/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.921109 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/proxy-httpd/0.log" Sep 30 15:07:40 crc kubenswrapper[4676]: I0930 15:07:40.958325 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2a301731-560b-431e-b265-ef436fa8eccb/sg-core/0.log" Sep 30 15:07:41 crc kubenswrapper[4676]: I0930 15:07:41.150123 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f490182f-5ea6-45fa-85d0-a6b1c02c5849/cinder-api/0.log" Sep 30 15:07:41 crc kubenswrapper[4676]: I0930 15:07:41.167148 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f490182f-5ea6-45fa-85d0-a6b1c02c5849/cinder-api-log/0.log" Sep 30 15:07:41 crc kubenswrapper[4676]: I0930 15:07:41.373629 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_21c274fe-4499-4294-b725-96e48b657186/probe/0.log" Sep 30 15:07:41 crc kubenswrapper[4676]: I0930 15:07:41.400238 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_21c274fe-4499-4294-b725-96e48b657186/cinder-scheduler/0.log" Sep 30 15:07:41 crc 
kubenswrapper[4676]: I0930 15:07:41.578486 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-56d9g_24983a6b-dac1-4567-b8b8-ded54e7287bb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:41 crc kubenswrapper[4676]: I0930 15:07:41.765451 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gq8qc_2ecdd5ea-0ceb-47e4-9e2a-3782b3050fa7/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:41 crc kubenswrapper[4676]: I0930 15:07:41.883390 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zhdfz_6f7ebae7-0748-4052-859c-fb6a5fa89d33/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.059991 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-qglmb_a391dbc4-4a80-4a26-9e6d-5903b425ae97/init/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.223303 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-qglmb_a391dbc4-4a80-4a26-9e6d-5903b425ae97/init/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.292235 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-qglmb_a391dbc4-4a80-4a26-9e6d-5903b425ae97/dnsmasq-dns/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.509786 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-p4p5c_8b08c117-d7d7-4bc3-89a0-8a05169688fa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.528461 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_a22bcf65-b8af-4f8a-845c-31b1b3609e05/glance-httpd/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.721090 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a22bcf65-b8af-4f8a-845c-31b1b3609e05/glance-log/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.781671 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cec7cd30-e0cb-41bb-a620-8d3fad4e2338/glance-httpd/0.log" Sep 30 15:07:42 crc kubenswrapper[4676]: I0930 15:07:42.973798 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_cec7cd30-e0cb-41bb-a620-8d3fad4e2338/glance-log/0.log" Sep 30 15:07:43 crc kubenswrapper[4676]: I0930 15:07:43.127827 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fc47cdb4-6758j_a020c8ba-b848-4a3f-80e4-b3692cf99ffa/horizon/0.log" Sep 30 15:07:43 crc kubenswrapper[4676]: I0930 15:07:43.529757 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-524mx_425308e0-6300-4e6a-922e-dc9ef39d61f8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:43 crc kubenswrapper[4676]: I0930 15:07:43.609826 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fc47cdb4-6758j_a020c8ba-b848-4a3f-80e4-b3692cf99ffa/horizon-log/0.log" Sep 30 15:07:43 crc kubenswrapper[4676]: I0930 15:07:43.856186 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-j8kmb_29932146-0fdd-4717-8a42-2b04967df9ce/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:44 crc kubenswrapper[4676]: I0930 15:07:44.017348 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29320741-8mjdr_a4a69823-fcbd-4141-83bd-6f242e8304be/keystone-cron/0.log" Sep 30 15:07:44 crc kubenswrapper[4676]: I0930 15:07:44.195817 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-d85658b59-96mgj_4e10025d-8396-4100-8652-3358d52c3199/keystone-api/0.log" Sep 30 15:07:44 crc kubenswrapper[4676]: I0930 15:07:44.299643 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ab292b94-70ab-4d77-9100-d6db2654e3e2/kube-state-metrics/0.log" Sep 30 15:07:44 crc kubenswrapper[4676]: I0930 15:07:44.474928 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-dr7m6_53729d22-521b-4f61-a225-832492a98b7b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:44 crc kubenswrapper[4676]: I0930 15:07:44.878260 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56956855f5-jwqlp_af5d35f3-c607-4084-9585-0a750ea54db5/neutron-api/0.log" Sep 30 15:07:44 crc kubenswrapper[4676]: I0930 15:07:44.973763 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56956855f5-jwqlp_af5d35f3-c607-4084-9585-0a750ea54db5/neutron-httpd/0.log" Sep 30 15:07:45 crc kubenswrapper[4676]: I0930 15:07:45.433530 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:07:45 crc kubenswrapper[4676]: E0930 15:07:45.433783 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:07:45 crc kubenswrapper[4676]: I0930 
15:07:45.746782 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-w6g6m_87967da4-c3f2-46e1-ae80-230612ebe6af/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:46 crc kubenswrapper[4676]: I0930 15:07:46.422944 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b49d9f0a-d50a-409b-b985-c09b657e9ba2/nova-api-log/0.log" Sep 30 15:07:46 crc kubenswrapper[4676]: I0930 15:07:46.709428 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_7c38ef92-eb09-4ce7-b23f-10886d83860c/nova-cell0-conductor-conductor/0.log" Sep 30 15:07:46 crc kubenswrapper[4676]: I0930 15:07:46.818199 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b49d9f0a-d50a-409b-b985-c09b657e9ba2/nova-api-api/0.log" Sep 30 15:07:47 crc kubenswrapper[4676]: I0930 15:07:47.111032 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f3250c50-5426-440d-a8ba-9a4f75001b16/nova-cell1-conductor-conductor/0.log" Sep 30 15:07:47 crc kubenswrapper[4676]: I0930 15:07:47.221593 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1cc6f007-8ed9-4512-8b1b-70e2081f873a/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 15:07:47 crc kubenswrapper[4676]: I0930 15:07:47.868643 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gjtn5_e4cb3ae8-cc50-4de4-b279-51105c6fc45c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:47 crc kubenswrapper[4676]: I0930 15:07:47.937719 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b9578818-8dfa-4aec-8923-d1d9424068be/nova-metadata-log/0.log" Sep 30 15:07:48 crc kubenswrapper[4676]: I0930 15:07:48.437089 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_92d9c3ce-1a0b-48d0-a88b-ce25162e54b0/nova-scheduler-scheduler/0.log" Sep 30 15:07:48 crc kubenswrapper[4676]: I0930 15:07:48.723753 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9762912c-8ca3-4791-93c0-4d5728543998/mysql-bootstrap/0.log" Sep 30 15:07:48 crc kubenswrapper[4676]: I0930 15:07:48.779401 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9762912c-8ca3-4791-93c0-4d5728543998/mysql-bootstrap/0.log" Sep 30 15:07:48 crc kubenswrapper[4676]: I0930 15:07:48.959733 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9762912c-8ca3-4791-93c0-4d5728543998/galera/0.log" Sep 30 15:07:49 crc kubenswrapper[4676]: I0930 15:07:49.178106 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d32e1e85-6a70-4751-9223-85e7018c3cc7/mysql-bootstrap/0.log" Sep 30 15:07:49 crc kubenswrapper[4676]: I0930 15:07:49.431216 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d32e1e85-6a70-4751-9223-85e7018c3cc7/galera/0.log" Sep 30 15:07:49 crc kubenswrapper[4676]: I0930 15:07:49.481545 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d32e1e85-6a70-4751-9223-85e7018c3cc7/mysql-bootstrap/0.log" Sep 30 15:07:49 crc kubenswrapper[4676]: I0930 15:07:49.695950 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b9578818-8dfa-4aec-8923-d1d9424068be/nova-metadata-metadata/0.log" Sep 30 15:07:49 crc kubenswrapper[4676]: I0930 15:07:49.717962 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f682d501-bba0-4b08-98aa-0ee2a0603939/openstackclient/0.log" Sep 30 15:07:49 crc kubenswrapper[4676]: I0930 15:07:49.998773 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-9nlxb_72431bf0-bd8f-431d-81a8-082f9ef654e1/ovn-controller/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.174827 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w8c76_39632f21-878d-4bb9-ba72-afcac2cd0b5d/openstack-network-exporter/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.315847 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovsdb-server-init/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.476308 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovsdb-server-init/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.521515 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovsdb-server/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.569722 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7t792_a69f5429-b9b9-47c3-b720-1a59c5d87b27/ovs-vswitchd/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.764768 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-js2fj_ac2b4c79-3867-4f1b-bb55-d0978cffaded/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.926560 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2aa43108-6602-4fc3-b8b1-ce07a8ef0b31/openstack-network-exporter/0.log" Sep 30 15:07:50 crc kubenswrapper[4676]: I0930 15:07:50.965400 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2aa43108-6602-4fc3-b8b1-ce07a8ef0b31/ovn-northd/0.log" Sep 30 15:07:51 crc kubenswrapper[4676]: I0930 15:07:51.137799 4676 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_35e313d1-3779-4eb1-b12f-c3b5432dfd1d/openstack-network-exporter/0.log" Sep 30 15:07:51 crc kubenswrapper[4676]: I0930 15:07:51.179808 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_35e313d1-3779-4eb1-b12f-c3b5432dfd1d/ovsdbserver-nb/0.log" Sep 30 15:07:51 crc kubenswrapper[4676]: I0930 15:07:51.371052 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e73b3580-31d5-4c06-9bd8-acbd16c5c48d/ovsdbserver-sb/0.log" Sep 30 15:07:51 crc kubenswrapper[4676]: I0930 15:07:51.405144 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e73b3580-31d5-4c06-9bd8-acbd16c5c48d/openstack-network-exporter/0.log" Sep 30 15:07:51 crc kubenswrapper[4676]: I0930 15:07:51.694670 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6558bbc9d4-wdcbn_c5505b25-a501-44f0-8b24-6630fb71d41b/placement-api/0.log" Sep 30 15:07:51 crc kubenswrapper[4676]: I0930 15:07:51.744506 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6558bbc9d4-wdcbn_c5505b25-a501-44f0-8b24-6630fb71d41b/placement-log/0.log" Sep 30 15:07:51 crc kubenswrapper[4676]: I0930 15:07:51.935685 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56d86f95-7d72-42d4-84d7-fdca29b1270f/setup-container/0.log" Sep 30 15:07:52 crc kubenswrapper[4676]: I0930 15:07:52.108362 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56d86f95-7d72-42d4-84d7-fdca29b1270f/rabbitmq/0.log" Sep 30 15:07:52 crc kubenswrapper[4676]: I0930 15:07:52.196957 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_56d86f95-7d72-42d4-84d7-fdca29b1270f/setup-container/0.log" Sep 30 15:07:52 crc kubenswrapper[4676]: I0930 15:07:52.359697 4676 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_140577c7-99f4-4dc1-85dd-1bec990df549/setup-container/0.log" Sep 30 15:07:52 crc kubenswrapper[4676]: I0930 15:07:52.587638 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_140577c7-99f4-4dc1-85dd-1bec990df549/rabbitmq/0.log" Sep 30 15:07:52 crc kubenswrapper[4676]: I0930 15:07:52.615173 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_140577c7-99f4-4dc1-85dd-1bec990df549/setup-container/0.log" Sep 30 15:07:52 crc kubenswrapper[4676]: I0930 15:07:52.823375 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-sgkqz_fc6b249a-7b71-4b2e-9d6a-6b6b635b8ddc/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:53 crc kubenswrapper[4676]: I0930 15:07:53.061244 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-cj72b_bcad236c-a0f7-47a2-ae9e-e52839eaee9d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:53 crc kubenswrapper[4676]: I0930 15:07:53.070532 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-54lsm_8d063f50-40de-47eb-9849-8b29cee35392/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:53 crc kubenswrapper[4676]: I0930 15:07:53.284792 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-flzt4_44bcdced-a8cf-4b1d-baa4-31988a1ca72d/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:53 crc kubenswrapper[4676]: I0930 15:07:53.510251 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-kgg8r_c9aca039-cea1-4fe5-8ee1-226f22cbefd2/ssh-known-hosts-edpm-deployment/0.log" Sep 30 15:07:53 crc kubenswrapper[4676]: I0930 15:07:53.779578 4676 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-6f9769955f-6wd7m_b18c2fcd-dc66-434b-b3ef-61215f24a511/proxy-server/0.log" Sep 30 15:07:53 crc kubenswrapper[4676]: I0930 15:07:53.805728 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f9769955f-6wd7m_b18c2fcd-dc66-434b-b3ef-61215f24a511/proxy-httpd/0.log" Sep 30 15:07:53 crc kubenswrapper[4676]: I0930 15:07:53.976769 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vbg62_fb668321-0ea7-4c30-9773-6c7f511959f4/swift-ring-rebalance/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.081380 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-auditor/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.199081 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-reaper/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.351712 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-replicator/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.383524 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/account-server/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.395193 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-auditor/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.578369 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-replicator/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.606991 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-updater/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.625332 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/container-server/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.779260 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-auditor/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.832072 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-replicator/0.log" Sep 30 15:07:54 crc kubenswrapper[4676]: I0930 15:07:54.863493 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-expirer/0.log" Sep 30 15:07:55 crc kubenswrapper[4676]: I0930 15:07:55.036145 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-server/0.log" Sep 30 15:07:55 crc kubenswrapper[4676]: I0930 15:07:55.090545 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/object-updater/0.log" Sep 30 15:07:55 crc kubenswrapper[4676]: I0930 15:07:55.095460 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/rsync/0.log" Sep 30 15:07:55 crc kubenswrapper[4676]: I0930 15:07:55.272280 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6c0230d2-8bbc-4ad0-8f3d-062d2d940013/swift-recon-cron/0.log" Sep 30 15:07:55 crc kubenswrapper[4676]: I0930 15:07:55.379170 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-hzvwk_6b813464-177e-4354-af90-edefef63c05c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:55 crc kubenswrapper[4676]: I0930 15:07:55.609685 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_10b046c1-241e-4dfd-9aa3-d3e5532a6190/tempest-tests-tempest-tests-runner/0.log" Sep 30 15:07:56 crc kubenswrapper[4676]: I0930 15:07:56.305364 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0def7648-4733-47fb-bd01-0dba601ea3cc/test-operator-logs-container/0.log" Sep 30 15:07:56 crc kubenswrapper[4676]: I0930 15:07:56.315257 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6v5pp_d889c774-3b4f-4b05-9fd4-f18ad2aa2e4b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 15:07:56 crc kubenswrapper[4676]: I0930 15:07:56.433648 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:07:56 crc kubenswrapper[4676]: E0930 15:07:56.434024 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:08:08 crc kubenswrapper[4676]: I0930 15:08:08.061104 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7aef6349-6b33-4f9e-972d-a990cb3ff62e/memcached/0.log" Sep 30 15:08:09 crc kubenswrapper[4676]: I0930 15:08:09.433852 4676 scope.go:117] "RemoveContainer" 
containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:08:09 crc kubenswrapper[4676]: E0930 15:08:09.434449 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:08:24 crc kubenswrapper[4676]: I0930 15:08:24.252899 4676 scope.go:117] "RemoveContainer" containerID="88af6369ea5f80e4b1fd977c9f74cbf68fbbef393507358f88acbbfc54fc70c2" Sep 30 15:08:24 crc kubenswrapper[4676]: I0930 15:08:24.433640 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:08:24 crc kubenswrapper[4676]: E0930 15:08:24.433942 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4k2dp_openshift-machine-config-operator(af133cb7-f0e4-428e-b348-c6e81493fc1d)\"" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" Sep 30 15:08:38 crc kubenswrapper[4676]: I0930 15:08:38.163004 4676 generic.go:334] "Generic (PLEG): container finished" podID="0d7a5463-fda2-48c6-b9b9-72a9f724298a" containerID="b132d4d4418c8be10fd84cd8a531657cbf9cd19b789c6b8577188db054e51fa7" exitCode=0 Sep 30 15:08:38 crc kubenswrapper[4676]: I0930 15:08:38.163090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" event={"ID":"0d7a5463-fda2-48c6-b9b9-72a9f724298a","Type":"ContainerDied","Data":"b132d4d4418c8be10fd84cd8a531657cbf9cd19b789c6b8577188db054e51fa7"} Sep 
30 15:08:38 crc kubenswrapper[4676]: I0930 15:08:38.433666 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.175209 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"409286e2182fc474d7835ff7696ee8a19195b801dc6244941fe03db1195dad01"} Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.292295 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.328399 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-rzz85"] Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.338474 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-rzz85"] Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.407121 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9trn\" (UniqueName: \"kubernetes.io/projected/0d7a5463-fda2-48c6-b9b9-72a9f724298a-kube-api-access-r9trn\") pod \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.407271 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7a5463-fda2-48c6-b9b9-72a9f724298a-host\") pod \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\" (UID: \"0d7a5463-fda2-48c6-b9b9-72a9f724298a\") " Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.407368 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d7a5463-fda2-48c6-b9b9-72a9f724298a-host" (OuterVolumeSpecName: "host") pod 
"0d7a5463-fda2-48c6-b9b9-72a9f724298a" (UID: "0d7a5463-fda2-48c6-b9b9-72a9f724298a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.407756 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7a5463-fda2-48c6-b9b9-72a9f724298a-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.416480 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7a5463-fda2-48c6-b9b9-72a9f724298a-kube-api-access-r9trn" (OuterVolumeSpecName: "kube-api-access-r9trn") pod "0d7a5463-fda2-48c6-b9b9-72a9f724298a" (UID: "0d7a5463-fda2-48c6-b9b9-72a9f724298a"). InnerVolumeSpecName "kube-api-access-r9trn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.445978 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7a5463-fda2-48c6-b9b9-72a9f724298a" path="/var/lib/kubelet/pods/0d7a5463-fda2-48c6-b9b9-72a9f724298a/volumes" Sep 30 15:08:39 crc kubenswrapper[4676]: I0930 15:08:39.511366 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9trn\" (UniqueName: \"kubernetes.io/projected/0d7a5463-fda2-48c6-b9b9-72a9f724298a-kube-api-access-r9trn\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.191647 4676 scope.go:117] "RemoveContainer" containerID="b132d4d4418c8be10fd84cd8a531657cbf9cd19b789c6b8577188db054e51fa7" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.191714 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-rzz85" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.516091 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-wkvkr"] Sep 30 15:08:40 crc kubenswrapper[4676]: E0930 15:08:40.516960 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="extract-utilities" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.516983 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="extract-utilities" Sep 30 15:08:40 crc kubenswrapper[4676]: E0930 15:08:40.517001 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="extract-utilities" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517009 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="extract-utilities" Sep 30 15:08:40 crc kubenswrapper[4676]: E0930 15:08:40.517025 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="registry-server" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517033 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="registry-server" Sep 30 15:08:40 crc kubenswrapper[4676]: E0930 15:08:40.517047 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="extract-content" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517053 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="extract-content" Sep 30 15:08:40 crc kubenswrapper[4676]: E0930 15:08:40.517065 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="registry-server" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517072 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="registry-server" Sep 30 15:08:40 crc kubenswrapper[4676]: E0930 15:08:40.517099 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7a5463-fda2-48c6-b9b9-72a9f724298a" containerName="container-00" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517107 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7a5463-fda2-48c6-b9b9-72a9f724298a" containerName="container-00" Sep 30 15:08:40 crc kubenswrapper[4676]: E0930 15:08:40.517123 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="extract-content" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517132 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="extract-content" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517360 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29f2572-6f3b-4af1-8628-915d588f9ec3" containerName="registry-server" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517395 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="06dc138c-5f30-4827-ba1e-6a7c83ffda97" containerName="registry-server" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.517410 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7a5463-fda2-48c6-b9b9-72a9f724298a" containerName="container-00" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.518210 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.520172 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n4b4l"/"default-dockercfg-6k2x6" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.639220 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cbee192-6582-44fe-9288-823abe05d10b-host\") pod \"crc-debug-wkvkr\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.639301 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t486j\" (UniqueName: \"kubernetes.io/projected/2cbee192-6582-44fe-9288-823abe05d10b-kube-api-access-t486j\") pod \"crc-debug-wkvkr\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.740973 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cbee192-6582-44fe-9288-823abe05d10b-host\") pod \"crc-debug-wkvkr\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.741040 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t486j\" (UniqueName: \"kubernetes.io/projected/2cbee192-6582-44fe-9288-823abe05d10b-kube-api-access-t486j\") pod \"crc-debug-wkvkr\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.741111 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2cbee192-6582-44fe-9288-823abe05d10b-host\") pod \"crc-debug-wkvkr\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.763131 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t486j\" (UniqueName: \"kubernetes.io/projected/2cbee192-6582-44fe-9288-823abe05d10b-kube-api-access-t486j\") pod \"crc-debug-wkvkr\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: I0930 15:08:40.837073 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:40 crc kubenswrapper[4676]: W0930 15:08:40.868640 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cbee192_6582_44fe_9288_823abe05d10b.slice/crio-9878f9a8d99a865616029774abc04153c3ebfd8bc5e51710c126bd341a82ef6f WatchSource:0}: Error finding container 9878f9a8d99a865616029774abc04153c3ebfd8bc5e51710c126bd341a82ef6f: Status 404 returned error can't find the container with id 9878f9a8d99a865616029774abc04153c3ebfd8bc5e51710c126bd341a82ef6f Sep 30 15:08:41 crc kubenswrapper[4676]: I0930 15:08:41.217106 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" event={"ID":"2cbee192-6582-44fe-9288-823abe05d10b","Type":"ContainerStarted","Data":"8a9bf9ca19aea007f057e9c9e8a46e3f841a6ba9cf9b4b8b2541009fc7387c8d"} Sep 30 15:08:41 crc kubenswrapper[4676]: I0930 15:08:41.217464 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" event={"ID":"2cbee192-6582-44fe-9288-823abe05d10b","Type":"ContainerStarted","Data":"9878f9a8d99a865616029774abc04153c3ebfd8bc5e51710c126bd341a82ef6f"} Sep 30 15:08:41 crc kubenswrapper[4676]: I0930 
15:08:41.241329 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" podStartSLOduration=1.241307492 podStartE2EDuration="1.241307492s" podCreationTimestamp="2025-09-30 15:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 15:08:41.232780048 +0000 UTC m=+4225.215868497" watchObservedRunningTime="2025-09-30 15:08:41.241307492 +0000 UTC m=+4225.224395921" Sep 30 15:08:42 crc kubenswrapper[4676]: I0930 15:08:42.229552 4676 generic.go:334] "Generic (PLEG): container finished" podID="2cbee192-6582-44fe-9288-823abe05d10b" containerID="8a9bf9ca19aea007f057e9c9e8a46e3f841a6ba9cf9b4b8b2541009fc7387c8d" exitCode=0 Sep 30 15:08:42 crc kubenswrapper[4676]: I0930 15:08:42.229627 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" event={"ID":"2cbee192-6582-44fe-9288-823abe05d10b","Type":"ContainerDied","Data":"8a9bf9ca19aea007f057e9c9e8a46e3f841a6ba9cf9b4b8b2541009fc7387c8d"} Sep 30 15:08:43 crc kubenswrapper[4676]: I0930 15:08:43.348669 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:43 crc kubenswrapper[4676]: I0930 15:08:43.480637 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t486j\" (UniqueName: \"kubernetes.io/projected/2cbee192-6582-44fe-9288-823abe05d10b-kube-api-access-t486j\") pod \"2cbee192-6582-44fe-9288-823abe05d10b\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " Sep 30 15:08:43 crc kubenswrapper[4676]: I0930 15:08:43.481139 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cbee192-6582-44fe-9288-823abe05d10b-host\") pod \"2cbee192-6582-44fe-9288-823abe05d10b\" (UID: \"2cbee192-6582-44fe-9288-823abe05d10b\") " Sep 30 15:08:43 crc kubenswrapper[4676]: I0930 15:08:43.481357 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cbee192-6582-44fe-9288-823abe05d10b-host" (OuterVolumeSpecName: "host") pod "2cbee192-6582-44fe-9288-823abe05d10b" (UID: "2cbee192-6582-44fe-9288-823abe05d10b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 15:08:43 crc kubenswrapper[4676]: I0930 15:08:43.481804 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cbee192-6582-44fe-9288-823abe05d10b-host\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:43 crc kubenswrapper[4676]: I0930 15:08:43.487055 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbee192-6582-44fe-9288-823abe05d10b-kube-api-access-t486j" (OuterVolumeSpecName: "kube-api-access-t486j") pod "2cbee192-6582-44fe-9288-823abe05d10b" (UID: "2cbee192-6582-44fe-9288-823abe05d10b"). InnerVolumeSpecName "kube-api-access-t486j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:08:43 crc kubenswrapper[4676]: I0930 15:08:43.583396 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t486j\" (UniqueName: \"kubernetes.io/projected/2cbee192-6582-44fe-9288-823abe05d10b-kube-api-access-t486j\") on node \"crc\" DevicePath \"\"" Sep 30 15:08:44 crc kubenswrapper[4676]: I0930 15:08:44.261005 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" event={"ID":"2cbee192-6582-44fe-9288-823abe05d10b","Type":"ContainerDied","Data":"9878f9a8d99a865616029774abc04153c3ebfd8bc5e51710c126bd341a82ef6f"} Sep 30 15:08:44 crc kubenswrapper[4676]: I0930 15:08:44.261076 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9878f9a8d99a865616029774abc04153c3ebfd8bc5e51710c126bd341a82ef6f" Sep 30 15:08:44 crc kubenswrapper[4676]: I0930 15:08:44.261081 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-wkvkr" Sep 30 15:08:48 crc kubenswrapper[4676]: I0930 15:08:48.836334 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-wkvkr"] Sep 30 15:08:48 crc kubenswrapper[4676]: I0930 15:08:48.845482 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-wkvkr"] Sep 30 15:08:49 crc kubenswrapper[4676]: I0930 15:08:49.447573 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cbee192-6582-44fe-9288-823abe05d10b" path="/var/lib/kubelet/pods/2cbee192-6582-44fe-9288-823abe05d10b/volumes" Sep 30 15:08:49 crc kubenswrapper[4676]: I0930 15:08:49.995020 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-6k692"] Sep 30 15:08:49 crc kubenswrapper[4676]: E0930 15:08:49.995907 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbee192-6582-44fe-9288-823abe05d10b" 
containerName="container-00" Sep 30 15:08:49 crc kubenswrapper[4676]: I0930 15:08:49.995926 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbee192-6582-44fe-9288-823abe05d10b" containerName="container-00" Sep 30 15:08:49 crc kubenswrapper[4676]: I0930 15:08:49.996163 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cbee192-6582-44fe-9288-823abe05d10b" containerName="container-00" Sep 30 15:08:49 crc kubenswrapper[4676]: I0930 15:08:49.997050 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-6k692" Sep 30 15:08:49 crc kubenswrapper[4676]: I0930 15:08:49.999349 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n4b4l"/"default-dockercfg-6k2x6" Sep 30 15:08:50 crc kubenswrapper[4676]: I0930 15:08:50.090521 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/440c6042-5b85-4a6a-9b4f-4db7306d08ba-host\") pod \"crc-debug-6k692\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") " pod="openshift-must-gather-n4b4l/crc-debug-6k692" Sep 30 15:08:50 crc kubenswrapper[4676]: I0930 15:08:50.090595 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49wj\" (UniqueName: \"kubernetes.io/projected/440c6042-5b85-4a6a-9b4f-4db7306d08ba-kube-api-access-k49wj\") pod \"crc-debug-6k692\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") " pod="openshift-must-gather-n4b4l/crc-debug-6k692" Sep 30 15:08:50 crc kubenswrapper[4676]: I0930 15:08:50.192329 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/440c6042-5b85-4a6a-9b4f-4db7306d08ba-host\") pod \"crc-debug-6k692\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") " pod="openshift-must-gather-n4b4l/crc-debug-6k692" Sep 30 15:08:50 crc kubenswrapper[4676]: I0930 
15:08:50.192417 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49wj\" (UniqueName: \"kubernetes.io/projected/440c6042-5b85-4a6a-9b4f-4db7306d08ba-kube-api-access-k49wj\") pod \"crc-debug-6k692\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") " pod="openshift-must-gather-n4b4l/crc-debug-6k692" Sep 30 15:08:50 crc kubenswrapper[4676]: I0930 15:08:50.192538 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/440c6042-5b85-4a6a-9b4f-4db7306d08ba-host\") pod \"crc-debug-6k692\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") " pod="openshift-must-gather-n4b4l/crc-debug-6k692" Sep 30 15:08:50 crc kubenswrapper[4676]: I0930 15:08:50.213255 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49wj\" (UniqueName: \"kubernetes.io/projected/440c6042-5b85-4a6a-9b4f-4db7306d08ba-kube-api-access-k49wj\") pod \"crc-debug-6k692\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") " pod="openshift-must-gather-n4b4l/crc-debug-6k692" Sep 30 15:08:50 crc kubenswrapper[4676]: I0930 15:08:50.316062 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-6k692"
Sep 30 15:08:51 crc kubenswrapper[4676]: I0930 15:08:51.326374 4676 generic.go:334] "Generic (PLEG): container finished" podID="440c6042-5b85-4a6a-9b4f-4db7306d08ba" containerID="38f5ea999decc4b25984a3439c5b50b1513f244ebe0e7ad34aedd30fd3c98870" exitCode=0
Sep 30 15:08:51 crc kubenswrapper[4676]: I0930 15:08:51.326459 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-6k692" event={"ID":"440c6042-5b85-4a6a-9b4f-4db7306d08ba","Type":"ContainerDied","Data":"38f5ea999decc4b25984a3439c5b50b1513f244ebe0e7ad34aedd30fd3c98870"}
Sep 30 15:08:51 crc kubenswrapper[4676]: I0930 15:08:51.326745 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/crc-debug-6k692" event={"ID":"440c6042-5b85-4a6a-9b4f-4db7306d08ba","Type":"ContainerStarted","Data":"9d841f7236c781e855b05118e5d4029095972f16488a7a80cfe8dde1824bf0fb"}
Sep 30 15:08:51 crc kubenswrapper[4676]: I0930 15:08:51.369214 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-6k692"]
Sep 30 15:08:51 crc kubenswrapper[4676]: I0930 15:08:51.378855 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n4b4l/crc-debug-6k692"]
Sep 30 15:08:52 crc kubenswrapper[4676]: I0930 15:08:52.454723 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-6k692"
Sep 30 15:08:52 crc kubenswrapper[4676]: I0930 15:08:52.538558 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/440c6042-5b85-4a6a-9b4f-4db7306d08ba-host\") pod \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") "
Sep 30 15:08:52 crc kubenswrapper[4676]: I0930 15:08:52.538690 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/440c6042-5b85-4a6a-9b4f-4db7306d08ba-host" (OuterVolumeSpecName: "host") pod "440c6042-5b85-4a6a-9b4f-4db7306d08ba" (UID: "440c6042-5b85-4a6a-9b4f-4db7306d08ba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 15:08:52 crc kubenswrapper[4676]: I0930 15:08:52.538712 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49wj\" (UniqueName: \"kubernetes.io/projected/440c6042-5b85-4a6a-9b4f-4db7306d08ba-kube-api-access-k49wj\") pod \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\" (UID: \"440c6042-5b85-4a6a-9b4f-4db7306d08ba\") "
Sep 30 15:08:52 crc kubenswrapper[4676]: I0930 15:08:52.540213 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/440c6042-5b85-4a6a-9b4f-4db7306d08ba-host\") on node \"crc\" DevicePath \"\""
Sep 30 15:08:52 crc kubenswrapper[4676]: I0930 15:08:52.553219 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440c6042-5b85-4a6a-9b4f-4db7306d08ba-kube-api-access-k49wj" (OuterVolumeSpecName: "kube-api-access-k49wj") pod "440c6042-5b85-4a6a-9b4f-4db7306d08ba" (UID: "440c6042-5b85-4a6a-9b4f-4db7306d08ba"). InnerVolumeSpecName "kube-api-access-k49wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 15:08:52 crc kubenswrapper[4676]: I0930 15:08:52.642002 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49wj\" (UniqueName: \"kubernetes.io/projected/440c6042-5b85-4a6a-9b4f-4db7306d08ba-kube-api-access-k49wj\") on node \"crc\" DevicePath \"\""
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.076137 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/util/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.277258 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/util/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.288054 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/pull/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.288197 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/pull/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.346523 4676 scope.go:117] "RemoveContainer" containerID="38f5ea999decc4b25984a3439c5b50b1513f244ebe0e7ad34aedd30fd3c98870"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.346559 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/crc-debug-6k692"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.446379 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440c6042-5b85-4a6a-9b4f-4db7306d08ba" path="/var/lib/kubelet/pods/440c6042-5b85-4a6a-9b4f-4db7306d08ba/volumes"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.480254 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/extract/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.482761 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/pull/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.483195 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_64d7a47084c6b06ecfa189b2173a2aa65d74cada1beaa569dc58135644wj2pm_0c22d3ee-9eda-44ac-b7af-367b71fc5505/util/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.643572 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-7b844_9e7d83e3-0f96-4a53-88ad-568d39435e5f/kube-rbac-proxy/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.739856 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-7b844_9e7d83e3-0f96-4a53-88ad-568d39435e5f/manager/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.765010 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-7tf2x_2342a742-ce41-4487-9d32-34fc69cb4445/kube-rbac-proxy/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.917158 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-7tf2x_2342a742-ce41-4487-9d32-34fc69cb4445/manager/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.941929 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7qc9p_8f9d1069-29eb-42e5-8029-1ed616f31c4a/kube-rbac-proxy/0.log"
Sep 30 15:08:53 crc kubenswrapper[4676]: I0930 15:08:53.978526 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7qc9p_8f9d1069-29eb-42e5-8029-1ed616f31c4a/manager/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.112610 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-zn6zg_b5ead6b1-3f68-454e-847c-89cac8d7f1f0/kube-rbac-proxy/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.189627 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-zn6zg_b5ead6b1-3f68-454e-847c-89cac8d7f1f0/manager/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.320059 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-6qcf5_930c8b21-3bfd-497b-9bc7-60f2cf7abde6/kube-rbac-proxy/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.335208 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-6qcf5_930c8b21-3bfd-497b-9bc7-60f2cf7abde6/manager/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.465745 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-s9znt_1d51c97e-7c47-4274-8bd4-bc3d7402a378/kube-rbac-proxy/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.519707 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-s9znt_1d51c97e-7c47-4274-8bd4-bc3d7402a378/manager/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.565844 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-mg54v_c15f6efe-27f0-4f55-b1f0-957366ff23a4/kube-rbac-proxy/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.769833 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4dzg7_5e535753-178a-4b7b-b20c-e13fa0be5ce1/kube-rbac-proxy/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.810586 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-mg54v_c15f6efe-27f0-4f55-b1f0-957366ff23a4/manager/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.830677 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-4dzg7_5e535753-178a-4b7b-b20c-e13fa0be5ce1/manager/0.log"
Sep 30 15:08:54 crc kubenswrapper[4676]: I0930 15:08:54.974326 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-8pv7q_7e6672d2-5e94-4d5d-b927-ad3573b95469/kube-rbac-proxy/0.log"
Sep 30 15:08:55 crc kubenswrapper[4676]: I0930 15:08:55.080762 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-8pv7q_7e6672d2-5e94-4d5d-b927-ad3573b95469/manager/0.log"
Sep 30 15:08:55 crc kubenswrapper[4676]: I0930 15:08:55.162042 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-gh2vw_daee0b60-331c-4108-8881-66cf4eb731e0/kube-rbac-proxy/0.log"
Sep 30 15:08:55 crc kubenswrapper[4676]: I0930 15:08:55.183294 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-gh2vw_daee0b60-331c-4108-8881-66cf4eb731e0/manager/0.log"
Sep 30 15:08:55 crc kubenswrapper[4676]: I0930 15:08:55.287474 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n5djz_d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85/kube-rbac-proxy/0.log"
Sep 30 15:08:55 crc kubenswrapper[4676]: I0930 15:08:55.389307 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-n5djz_d9d6dec4-1cc3-40c5-b90a-71b5f7a7da85/manager/0.log"
Sep 30 15:08:55 crc kubenswrapper[4676]: I0930 15:08:55.514718 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-t6h2t_dbe0db98-4cbd-49d2-9f6a-f54a8189c64b/kube-rbac-proxy/0.log"
Sep 30 15:08:55 crc kubenswrapper[4676]: I0930 15:08:55.570901 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-t6h2t_dbe0db98-4cbd-49d2-9f6a-f54a8189c64b/manager/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.307094 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-8fx9n_5dbc4210-e31a-4bf8-a5cb-6f00a7406743/kube-rbac-proxy/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.351754 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-8fx9n_5dbc4210-e31a-4bf8-a5cb-6f00a7406743/manager/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.463012 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-8m45v_43f93725-c577-4253-ae9c-7d14e8aec0b9/kube-rbac-proxy/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.469351 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-8m45v_43f93725-c577-4253-ae9c-7d14e8aec0b9/manager/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.537282 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qd7b8_96c8a26f-c044-429d-90eb-d0342486c32f/kube-rbac-proxy/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.539861 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-qd7b8_96c8a26f-c044-429d-90eb-d0342486c32f/manager/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.690355 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f5687bfdd-7nt8g_72b68346-543d-4b80-ba31-9bcb856b6989/kube-rbac-proxy/0.log"
Sep 30 15:08:56 crc kubenswrapper[4676]: I0930 15:08:56.729655 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55ccb8ddf4-slxtv_df3c9717-78cf-49b2-a967-7177da8f2e17/kube-rbac-proxy/0.log"
Sep 30 15:08:57 crc kubenswrapper[4676]: I0930 15:08:57.018059 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fhxvs_2bf0efcb-3cb6-491b-961f-6655a84de268/registry-server/0.log"
Sep 30 15:08:57 crc kubenswrapper[4676]: I0930 15:08:57.091490 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-55ccb8ddf4-slxtv_df3c9717-78cf-49b2-a967-7177da8f2e17/operator/0.log"
Sep 30 15:08:57 crc kubenswrapper[4676]: I0930 15:08:57.227837 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-g2s9t_e51ea15a-8d04-4d56-956d-0fcf41846eb8/kube-rbac-proxy/0.log"
Sep 30 15:08:57 crc kubenswrapper[4676]: I0930 15:08:57.311194 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-g2s9t_e51ea15a-8d04-4d56-956d-0fcf41846eb8/manager/0.log"
Sep 30 15:08:57 crc kubenswrapper[4676]: I0930 15:08:57.427323 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-9dzn5_a3762232-4e9f-452e-aea4-c5feb443ad75/kube-rbac-proxy/0.log"
Sep 30 15:08:57 crc kubenswrapper[4676]: I0930 15:08:57.472267 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-9dzn5_a3762232-4e9f-452e-aea4-c5feb443ad75/manager/0.log"
Sep 30 15:08:57 crc kubenswrapper[4676]: I0930 15:08:57.944545 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f5687bfdd-7nt8g_72b68346-543d-4b80-ba31-9bcb856b6989/manager/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.004074 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-xkkwv_677c476e-c8df-4a21-9968-b2bd23b246f6/operator/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.063769 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-l4p8f_cedc986e-ac92-45e8-862a-fc4dcb60455d/kube-rbac-proxy/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.161249 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-l4p8f_cedc986e-ac92-45e8-862a-fc4dcb60455d/manager/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.168956 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wf6fb_aa6dd699-ccd5-476f-ab9c-3d4841ed591a/kube-rbac-proxy/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.283685 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-wf6fb_aa6dd699-ccd5-476f-ab9c-3d4841ed591a/manager/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.371428 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-bvj6l_129d5672-c8dd-4a63-8d48-dc95c84a45b2/kube-rbac-proxy/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.420751 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-bvj6l_129d5672-c8dd-4a63-8d48-dc95c84a45b2/manager/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.451800 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-bwq8t_347e3ac8-4477-4bab-a64b-a443098bb400/kube-rbac-proxy/0.log"
Sep 30 15:08:58 crc kubenswrapper[4676]: I0930 15:08:58.507798 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-bwq8t_347e3ac8-4477-4bab-a64b-a443098bb400/manager/0.log"
Sep 30 15:09:13 crc kubenswrapper[4676]: I0930 15:09:13.752254 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wb8d2_97f344fc-42a3-4630-af31-ea25b72941e6/control-plane-machine-set-operator/0.log"
Sep 30 15:09:13 crc kubenswrapper[4676]: I0930 15:09:13.884387 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c9kjj_ebd6b987-d54a-4692-800a-8eadc5e8690c/kube-rbac-proxy/0.log"
Sep 30 15:09:13 crc kubenswrapper[4676]: I0930 15:09:13.938785 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-c9kjj_ebd6b987-d54a-4692-800a-8eadc5e8690c/machine-api-operator/0.log"
Sep 30 15:09:26 crc kubenswrapper[4676]: I0930 15:09:26.084071 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7k5cw_74eb08ee-8ebc-4f31-a952-22b99cfb68ac/cert-manager-controller/0.log"
Sep 30 15:09:26 crc kubenswrapper[4676]: I0930 15:09:26.322806 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-srpgg_faf2d116-cfb0-451e-b919-5ff1e93ee944/cert-manager-webhook/0.log"
Sep 30 15:09:26 crc kubenswrapper[4676]: I0930 15:09:26.358619 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-qd84c_f593d783-2014-4870-b97a-66bd22eba1b4/cert-manager-cainjector/0.log"
Sep 30 15:09:38 crc kubenswrapper[4676]: I0930 15:09:38.705170 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-l7jdz_87e24d2c-e308-4f03-a28c-eb3ca52bb5f6/nmstate-console-plugin/0.log"
Sep 30 15:09:38 crc kubenswrapper[4676]: I0930 15:09:38.737187 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-w7g4n_ec2f5c4c-d407-4e7d-9f0b-406feff7cdcc/nmstate-handler/0.log"
Sep 30 15:09:38 crc kubenswrapper[4676]: I0930 15:09:38.906767 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-w5b6p_6c1f56f8-8d1c-47f2-9099-25a15fdaee77/kube-rbac-proxy/0.log"
Sep 30 15:09:38 crc kubenswrapper[4676]: I0930 15:09:38.947764 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-w5b6p_6c1f56f8-8d1c-47f2-9099-25a15fdaee77/nmstate-metrics/0.log"
Sep 30 15:09:39 crc kubenswrapper[4676]: I0930 15:09:39.112778 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-4v459_73fea0d7-e7c7-4db0-8205-1c86203f6a88/nmstate-operator/0.log"
Sep 30 15:09:39 crc kubenswrapper[4676]: I0930 15:09:39.133570 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-rb9kz_9820f583-f5b4-4642-ade6-683242648b4d/nmstate-webhook/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.249327 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-c24th_43a57b66-554a-40f3-ae9c-1f8dd4053405/kube-rbac-proxy/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.376593 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-c24th_43a57b66-554a-40f3-ae9c-1f8dd4053405/controller/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.464138 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-rdndd_518077e0-6a46-480c-9cdd-d5d5c64814b7/frr-k8s-webhook-server/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.587004 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.791399 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.804667 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.806103 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log"
Sep 30 15:09:53 crc kubenswrapper[4676]: I0930 15:09:53.815764 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.057134 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.079817 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.084421 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.123020 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.258780 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-reloader/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.260100 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-metrics/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.289632 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/cp-frr-files/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.347361 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/controller/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.460294 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/frr-metrics/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.529556 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/kube-rbac-proxy/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.579964 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/kube-rbac-proxy-frr/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.688292 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/reloader/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.818724 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-76c7cc4875-dchh6_2a1324e4-02a6-4c81-b533-086cbd21e10f/manager/0.log"
Sep 30 15:09:54 crc kubenswrapper[4676]: I0930 15:09:54.987944 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-fb5687d59-w22bc_eb9ac7e3-b48e-44d4-9053-7e5ecec6a138/webhook-server/0.log"
Sep 30 15:09:55 crc kubenswrapper[4676]: I0930 15:09:55.188143 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4gcd5_32604773-f635-41b2-a665-740ace937075/kube-rbac-proxy/0.log"
Sep 30 15:09:55 crc kubenswrapper[4676]: I0930 15:09:55.866964 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-4gcd5_32604773-f635-41b2-a665-740ace937075/speaker/0.log"
Sep 30 15:09:56 crc kubenswrapper[4676]: I0930 15:09:56.211852 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x4mlw_5d8ff8dd-f29a-4701-abdd-0cc1751a0ca5/frr/0.log"
Sep 30 15:10:06 crc kubenswrapper[4676]: I0930 15:10:06.700980 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/util/0.log"
Sep 30 15:10:06 crc kubenswrapper[4676]: I0930 15:10:06.945360 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/util/0.log"
Sep 30 15:10:06 crc kubenswrapper[4676]: I0930 15:10:06.956500 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/pull/0.log"
Sep 30 15:10:06 crc kubenswrapper[4676]: I0930 15:10:06.966018 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/pull/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.143272 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/util/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.171963 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/pull/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.183406 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcksm7x_bc584293-077a-42bb-a90c-5f1931728034/extract/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.318956 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-utilities/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.494436 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-content/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.543666 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-content/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.553320 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-utilities/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.673533 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-content/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.680065 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/extract-utilities/0.log"
Sep 30 15:10:07 crc kubenswrapper[4676]: I0930 15:10:07.900431 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr58n_3fc5df74-e04c-45d8-92d4-a8e15bd54315/extract-utilities/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.152684 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr58n_3fc5df74-e04c-45d8-92d4-a8e15bd54315/extract-content/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.174011 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr58n_3fc5df74-e04c-45d8-92d4-a8e15bd54315/extract-utilities/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.255798 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr58n_3fc5df74-e04c-45d8-92d4-a8e15bd54315/extract-content/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.336472 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qxnqk_6b3ca6ec-bdbc-44be-90e5-21fa0d2cd5f1/registry-server/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.451164 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr58n_3fc5df74-e04c-45d8-92d4-a8e15bd54315/extract-content/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.480160 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr58n_3fc5df74-e04c-45d8-92d4-a8e15bd54315/extract-utilities/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.667717 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/util/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.670858 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wr58n_3fc5df74-e04c-45d8-92d4-a8e15bd54315/registry-server/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.847807 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/util/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.850090 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/pull/0.log"
Sep 30 15:10:08 crc kubenswrapper[4676]: I0930 15:10:08.897502 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/pull/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.044492 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/pull/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.047378 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/util/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.087464 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96stcjg_c6917646-fb37-4aa8-bd63-a09bdb713ea1/extract/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.310338 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-vnxs8_fbc2525f-9c5f-4639-908d-35fed61607f5/marketplace-operator/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.359320 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-utilities/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.511076 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-utilities/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.516745 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-content/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.530915 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-content/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.781176 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-content/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.812329 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/extract-utilities/0.log"
Sep 30 15:10:09 crc kubenswrapper[4676]: I0930 15:10:09.988710 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pw8ff_6ffc9090-e9c1-4f42-b73c-1ec86c49e317/registry-server/0.log"
Sep 30 15:10:10 crc kubenswrapper[4676]: I0930 15:10:10.040975 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-utilities/0.log"
Sep 30 15:10:10 crc kubenswrapper[4676]: I0930 15:10:10.173641 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-utilities/0.log"
Sep 30 15:10:10 crc kubenswrapper[4676]: I0930 15:10:10.191650 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-content/0.log"
Sep 30 15:10:10 crc kubenswrapper[4676]: I0930 15:10:10.239688 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-content/0.log"
Sep 30 15:10:10 crc kubenswrapper[4676]: I0930 15:10:10.366213 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-content/0.log"
Sep 30 15:10:10 crc kubenswrapper[4676]: I0930 15:10:10.366223 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/extract-utilities/0.log"
Sep 30 15:10:10 crc kubenswrapper[4676]: I0930 15:10:10.883605 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8kfpw_239015d7-f8f2-4823-9094-6a3248c8e0a0/registry-server/0.log"
Sep 30 15:10:59 crc kubenswrapper[4676]: I0930 15:10:59.920037 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:10:59 crc kubenswrapper[4676]: I0930 15:10:59.920693 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:11:29 crc kubenswrapper[4676]: I0930 15:11:29.919789 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:11:29 crc kubenswrapper[4676]: I0930 15:11:29.920423 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:11:59 crc kubenswrapper[4676]: I0930 15:11:59.919049 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:11:59 crc kubenswrapper[4676]: I0930 15:11:59.919703 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:11:59 crc kubenswrapper[4676]: I0930 15:11:59.919757 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp"
Sep 30 15:11:59 crc kubenswrapper[4676]: I0930 15:11:59.920497 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"409286e2182fc474d7835ff7696ee8a19195b801dc6244941fe03db1195dad01"} pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 15:11:59 crc kubenswrapper[4676]: I0930 15:11:59.920551 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d"
containerName="machine-config-daemon" containerID="cri-o://409286e2182fc474d7835ff7696ee8a19195b801dc6244941fe03db1195dad01" gracePeriod=600 Sep 30 15:12:01 crc kubenswrapper[4676]: I0930 15:12:01.034333 4676 generic.go:334] "Generic (PLEG): container finished" podID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerID="409286e2182fc474d7835ff7696ee8a19195b801dc6244941fe03db1195dad01" exitCode=0 Sep 30 15:12:01 crc kubenswrapper[4676]: I0930 15:12:01.034414 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerDied","Data":"409286e2182fc474d7835ff7696ee8a19195b801dc6244941fe03db1195dad01"} Sep 30 15:12:01 crc kubenswrapper[4676]: I0930 15:12:01.035430 4676 scope.go:117] "RemoveContainer" containerID="01200c66e62b433cacb49e583f03caf0998b33671a3d5ec07da6a24fae8e6b7a" Sep 30 15:12:01 crc kubenswrapper[4676]: I0930 15:12:01.036120 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" event={"ID":"af133cb7-f0e4-428e-b348-c6e81493fc1d","Type":"ContainerStarted","Data":"36d54d6658c8decdd62b5ffe451382af7f4fcc73f21cc6de9367b952d0768440"} Sep 30 15:12:13 crc kubenswrapper[4676]: I0930 15:12:13.142331 4676 generic.go:334] "Generic (PLEG): container finished" podID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerID="d206abf5e5622ace975e1f2162ef7a6afdc4030e43b1084febbd57c3473b9330" exitCode=0 Sep 30 15:12:13 crc kubenswrapper[4676]: I0930 15:12:13.142418 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" event={"ID":"181361c5-87f6-48f3-af90-6fa805a3ce1e","Type":"ContainerDied","Data":"d206abf5e5622ace975e1f2162ef7a6afdc4030e43b1084febbd57c3473b9330"} Sep 30 15:12:13 crc kubenswrapper[4676]: I0930 15:12:13.144455 4676 scope.go:117] "RemoveContainer" 
containerID="d206abf5e5622ace975e1f2162ef7a6afdc4030e43b1084febbd57c3473b9330" Sep 30 15:12:13 crc kubenswrapper[4676]: I0930 15:12:13.829206 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n4b4l_must-gather-4bsdb_181361c5-87f6-48f3-af90-6fa805a3ce1e/gather/0.log" Sep 30 15:12:25 crc kubenswrapper[4676]: I0930 15:12:25.826650 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n4b4l/must-gather-4bsdb"] Sep 30 15:12:25 crc kubenswrapper[4676]: I0930 15:12:25.827739 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerName="copy" containerID="cri-o://0ebb3bb354cc00eaf9bfa136ef5ec932074beeb83f53c21de7030ae7ad0f2764" gracePeriod=2 Sep 30 15:12:25 crc kubenswrapper[4676]: I0930 15:12:25.838111 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n4b4l/must-gather-4bsdb"] Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.278828 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n4b4l_must-gather-4bsdb_181361c5-87f6-48f3-af90-6fa805a3ce1e/copy/0.log" Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.279624 4676 generic.go:334] "Generic (PLEG): container finished" podID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerID="0ebb3bb354cc00eaf9bfa136ef5ec932074beeb83f53c21de7030ae7ad0f2764" exitCode=143 Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.279690 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d19967828f19efaf8e5f7a6d27321cf8569c61cade89ecdc664a0aab1a1e781b" Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.315072 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n4b4l_must-gather-4bsdb_181361c5-87f6-48f3-af90-6fa805a3ce1e/copy/0.log" Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.315565 4676 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.391938 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wk8l\" (UniqueName: \"kubernetes.io/projected/181361c5-87f6-48f3-af90-6fa805a3ce1e-kube-api-access-7wk8l\") pod \"181361c5-87f6-48f3-af90-6fa805a3ce1e\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.392175 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/181361c5-87f6-48f3-af90-6fa805a3ce1e-must-gather-output\") pod \"181361c5-87f6-48f3-af90-6fa805a3ce1e\" (UID: \"181361c5-87f6-48f3-af90-6fa805a3ce1e\") " Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.404363 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181361c5-87f6-48f3-af90-6fa805a3ce1e-kube-api-access-7wk8l" (OuterVolumeSpecName: "kube-api-access-7wk8l") pod "181361c5-87f6-48f3-af90-6fa805a3ce1e" (UID: "181361c5-87f6-48f3-af90-6fa805a3ce1e"). InnerVolumeSpecName "kube-api-access-7wk8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.494493 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wk8l\" (UniqueName: \"kubernetes.io/projected/181361c5-87f6-48f3-af90-6fa805a3ce1e-kube-api-access-7wk8l\") on node \"crc\" DevicePath \"\"" Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.549451 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181361c5-87f6-48f3-af90-6fa805a3ce1e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "181361c5-87f6-48f3-af90-6fa805a3ce1e" (UID: "181361c5-87f6-48f3-af90-6fa805a3ce1e"). 
InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:12:26 crc kubenswrapper[4676]: I0930 15:12:26.596121 4676 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/181361c5-87f6-48f3-af90-6fa805a3ce1e-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 15:12:27 crc kubenswrapper[4676]: I0930 15:12:27.287856 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n4b4l/must-gather-4bsdb" Sep 30 15:12:27 crc kubenswrapper[4676]: I0930 15:12:27.446010 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" path="/var/lib/kubelet/pods/181361c5-87f6-48f3-af90-6fa805a3ce1e/volumes" Sep 30 15:13:24 crc kubenswrapper[4676]: I0930 15:13:24.433130 4676 scope.go:117] "RemoveContainer" containerID="0ebb3bb354cc00eaf9bfa136ef5ec932074beeb83f53c21de7030ae7ad0f2764" Sep 30 15:13:24 crc kubenswrapper[4676]: I0930 15:13:24.454386 4676 scope.go:117] "RemoveContainer" containerID="d206abf5e5622ace975e1f2162ef7a6afdc4030e43b1084febbd57c3473b9330" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.198023 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gr2k8"] Sep 30 15:13:46 crc kubenswrapper[4676]: E0930 15:13:46.201971 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerName="copy" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.202010 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerName="copy" Sep 30 15:13:46 crc kubenswrapper[4676]: E0930 15:13:46.202022 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerName="gather" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.202029 4676 
state_mem.go:107] "Deleted CPUSet assignment" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerName="gather" Sep 30 15:13:46 crc kubenswrapper[4676]: E0930 15:13:46.202052 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440c6042-5b85-4a6a-9b4f-4db7306d08ba" containerName="container-00" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.202064 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="440c6042-5b85-4a6a-9b4f-4db7306d08ba" containerName="container-00" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.202443 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="440c6042-5b85-4a6a-9b4f-4db7306d08ba" containerName="container-00" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.202478 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerName="copy" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.202497 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="181361c5-87f6-48f3-af90-6fa805a3ce1e" containerName="gather" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.204338 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.238312 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr2k8"] Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.338806 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmf4q\" (UniqueName: \"kubernetes.io/projected/2f9338fc-d902-462e-96f0-9b595e8d3e42-kube-api-access-xmf4q\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.338890 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-catalog-content\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.338929 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-utilities\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.443232 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmf4q\" (UniqueName: \"kubernetes.io/projected/2f9338fc-d902-462e-96f0-9b595e8d3e42-kube-api-access-xmf4q\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.443390 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-catalog-content\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.443499 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-utilities\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.444108 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-utilities\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.444455 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-catalog-content\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.468324 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmf4q\" (UniqueName: \"kubernetes.io/projected/2f9338fc-d902-462e-96f0-9b595e8d3e42-kube-api-access-xmf4q\") pod \"redhat-operators-gr2k8\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:46 crc kubenswrapper[4676]: I0930 15:13:46.540299 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:47 crc kubenswrapper[4676]: I0930 15:13:47.005861 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr2k8"] Sep 30 15:13:47 crc kubenswrapper[4676]: I0930 15:13:47.029907 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2k8" event={"ID":"2f9338fc-d902-462e-96f0-9b595e8d3e42","Type":"ContainerStarted","Data":"9f92f702efb6ab8a045317699d166a9bd1a51700ee9788d438108266aac6b8d9"} Sep 30 15:13:48 crc kubenswrapper[4676]: I0930 15:13:48.043477 4676 generic.go:334] "Generic (PLEG): container finished" podID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerID="ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209" exitCode=0 Sep 30 15:13:48 crc kubenswrapper[4676]: I0930 15:13:48.043805 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2k8" event={"ID":"2f9338fc-d902-462e-96f0-9b595e8d3e42","Type":"ContainerDied","Data":"ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209"} Sep 30 15:13:48 crc kubenswrapper[4676]: I0930 15:13:48.046999 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 15:13:50 crc kubenswrapper[4676]: I0930 15:13:50.062834 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2k8" event={"ID":"2f9338fc-d902-462e-96f0-9b595e8d3e42","Type":"ContainerStarted","Data":"bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250"} Sep 30 15:13:52 crc kubenswrapper[4676]: I0930 15:13:52.082363 4676 generic.go:334] "Generic (PLEG): container finished" podID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerID="bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250" exitCode=0 Sep 30 15:13:52 crc kubenswrapper[4676]: I0930 15:13:52.082489 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gr2k8" event={"ID":"2f9338fc-d902-462e-96f0-9b595e8d3e42","Type":"ContainerDied","Data":"bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250"} Sep 30 15:13:53 crc kubenswrapper[4676]: I0930 15:13:53.093465 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2k8" event={"ID":"2f9338fc-d902-462e-96f0-9b595e8d3e42","Type":"ContainerStarted","Data":"5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4"} Sep 30 15:13:53 crc kubenswrapper[4676]: I0930 15:13:53.123287 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gr2k8" podStartSLOduration=2.573769265 podStartE2EDuration="7.123268136s" podCreationTimestamp="2025-09-30 15:13:46 +0000 UTC" firstStartedPulling="2025-09-30 15:13:48.046595328 +0000 UTC m=+4532.029683757" lastFinishedPulling="2025-09-30 15:13:52.596094199 +0000 UTC m=+4536.579182628" observedRunningTime="2025-09-30 15:13:53.113637414 +0000 UTC m=+4537.096725863" watchObservedRunningTime="2025-09-30 15:13:53.123268136 +0000 UTC m=+4537.106356565" Sep 30 15:13:56 crc kubenswrapper[4676]: I0930 15:13:56.541170 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:56 crc kubenswrapper[4676]: I0930 15:13:56.543495 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:13:57 crc kubenswrapper[4676]: I0930 15:13:57.595274 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gr2k8" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="registry-server" probeResult="failure" output=< Sep 30 15:13:57 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Sep 30 15:13:57 crc kubenswrapper[4676]: > Sep 30 15:14:06 crc kubenswrapper[4676]: I0930 
15:14:06.596900 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:14:06 crc kubenswrapper[4676]: I0930 15:14:06.657205 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:14:06 crc kubenswrapper[4676]: I0930 15:14:06.852044 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr2k8"] Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.241676 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gr2k8" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="registry-server" containerID="cri-o://5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4" gracePeriod=2 Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.696712 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.820064 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-catalog-content\") pod \"2f9338fc-d902-462e-96f0-9b595e8d3e42\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.820542 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-utilities\") pod \"2f9338fc-d902-462e-96f0-9b595e8d3e42\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.820617 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmf4q\" (UniqueName: 
\"kubernetes.io/projected/2f9338fc-d902-462e-96f0-9b595e8d3e42-kube-api-access-xmf4q\") pod \"2f9338fc-d902-462e-96f0-9b595e8d3e42\" (UID: \"2f9338fc-d902-462e-96f0-9b595e8d3e42\") " Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.821174 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-utilities" (OuterVolumeSpecName: "utilities") pod "2f9338fc-d902-462e-96f0-9b595e8d3e42" (UID: "2f9338fc-d902-462e-96f0-9b595e8d3e42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.826278 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f9338fc-d902-462e-96f0-9b595e8d3e42-kube-api-access-xmf4q" (OuterVolumeSpecName: "kube-api-access-xmf4q") pod "2f9338fc-d902-462e-96f0-9b595e8d3e42" (UID: "2f9338fc-d902-462e-96f0-9b595e8d3e42"). InnerVolumeSpecName "kube-api-access-xmf4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.911079 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f9338fc-d902-462e-96f0-9b595e8d3e42" (UID: "2f9338fc-d902-462e-96f0-9b595e8d3e42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.923403 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmf4q\" (UniqueName: \"kubernetes.io/projected/2f9338fc-d902-462e-96f0-9b595e8d3e42-kube-api-access-xmf4q\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.923456 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:08 crc kubenswrapper[4676]: I0930 15:14:08.923470 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f9338fc-d902-462e-96f0-9b595e8d3e42-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.267771 4676 generic.go:334] "Generic (PLEG): container finished" podID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerID="5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4" exitCode=0 Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.267824 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2k8" event={"ID":"2f9338fc-d902-462e-96f0-9b595e8d3e42","Type":"ContainerDied","Data":"5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4"} Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.267856 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr2k8" event={"ID":"2f9338fc-d902-462e-96f0-9b595e8d3e42","Type":"ContainerDied","Data":"9f92f702efb6ab8a045317699d166a9bd1a51700ee9788d438108266aac6b8d9"} Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.267896 4676 scope.go:117] "RemoveContainer" containerID="5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4" Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.267947 
4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gr2k8" Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.299414 4676 scope.go:117] "RemoveContainer" containerID="bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250" Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.311932 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr2k8"] Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.320125 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gr2k8"] Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.327246 4676 scope.go:117] "RemoveContainer" containerID="ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209" Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.384450 4676 scope.go:117] "RemoveContainer" containerID="5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4" Sep 30 15:14:09 crc kubenswrapper[4676]: E0930 15:14:09.384994 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4\": container with ID starting with 5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4 not found: ID does not exist" containerID="5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4" Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.385024 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4"} err="failed to get container status \"5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4\": rpc error: code = NotFound desc = could not find container \"5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4\": container with ID starting with 
5e6780de029fd5068ce791e914914228f06a1e40a793b255d0fd27c5c9f54fb4 not found: ID does not exist"
Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.385047 4676 scope.go:117] "RemoveContainer" containerID="bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250"
Sep 30 15:14:09 crc kubenswrapper[4676]: E0930 15:14:09.385351 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250\": container with ID starting with bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250 not found: ID does not exist" containerID="bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250"
Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.385396 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250"} err="failed to get container status \"bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250\": rpc error: code = NotFound desc = could not find container \"bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250\": container with ID starting with bd1fdf9a627b659f24e1681fa230b7d9682297d71386a8b05dd686a7afc95250 not found: ID does not exist"
Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.385427 4676 scope.go:117] "RemoveContainer" containerID="ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209"
Sep 30 15:14:09 crc kubenswrapper[4676]: E0930 15:14:09.385747 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209\": container with ID starting with ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209 not found: ID does not exist" containerID="ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209"
Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.385776 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209"} err="failed to get container status \"ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209\": rpc error: code = NotFound desc = could not find container \"ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209\": container with ID starting with ce2391b8cf0b3ed8147a360cf0790b2927e60228b32642d8d711230d09692209 not found: ID does not exist"
Sep 30 15:14:09 crc kubenswrapper[4676]: I0930 15:14:09.445500 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" path="/var/lib/kubelet/pods/2f9338fc-d902-462e-96f0-9b595e8d3e42/volumes"
Sep 30 15:14:29 crc kubenswrapper[4676]: I0930 15:14:29.919373 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:14:29 crc kubenswrapper[4676]: I0930 15:14:29.920025 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:14:59 crc kubenswrapper[4676]: I0930 15:14:59.919056 4676 patch_prober.go:28] interesting pod/machine-config-daemon-4k2dp container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 15:14:59 crc kubenswrapper[4676]: I0930 15:14:59.919569 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4k2dp" podUID="af133cb7-f0e4-428e-b348-c6e81493fc1d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.146141 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"]
Sep 30 15:15:00 crc kubenswrapper[4676]: E0930 15:15:00.146545 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="registry-server"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.146572 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="registry-server"
Sep 30 15:15:00 crc kubenswrapper[4676]: E0930 15:15:00.146611 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="extract-content"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.146619 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="extract-content"
Sep 30 15:15:00 crc kubenswrapper[4676]: E0930 15:15:00.146640 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="extract-utilities"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.146650 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="extract-utilities"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.146867 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f9338fc-d902-462e-96f0-9b595e8d3e42" containerName="registry-server"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.147607 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.150120 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.150651 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.157454 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"]
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.244971 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a88c33-3009-4d1c-8019-49936682c1e1-secret-volume\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.245035 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8j5z\" (UniqueName: \"kubernetes.io/projected/71a88c33-3009-4d1c-8019-49936682c1e1-kube-api-access-r8j5z\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.245710 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a88c33-3009-4d1c-8019-49936682c1e1-config-volume\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.348500 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a88c33-3009-4d1c-8019-49936682c1e1-secret-volume\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.348574 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8j5z\" (UniqueName: \"kubernetes.io/projected/71a88c33-3009-4d1c-8019-49936682c1e1-kube-api-access-r8j5z\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.348700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a88c33-3009-4d1c-8019-49936682c1e1-config-volume\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.351653 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71a88c33-3009-4d1c-8019-49936682c1e1-config-volume\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.360105 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71a88c33-3009-4d1c-8019-49936682c1e1-secret-volume\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.370077 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8j5z\" (UniqueName: \"kubernetes.io/projected/71a88c33-3009-4d1c-8019-49936682c1e1-kube-api-access-r8j5z\") pod \"collect-profiles-29320755-7622b\" (UID: \"71a88c33-3009-4d1c-8019-49936682c1e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.478179 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"
Sep 30 15:15:00 crc kubenswrapper[4676]: I0930 15:15:00.922085 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b"]
Sep 30 15:15:01 crc kubenswrapper[4676]: I0930 15:15:01.802763 4676 generic.go:334] "Generic (PLEG): container finished" podID="71a88c33-3009-4d1c-8019-49936682c1e1" containerID="9426bb5bbb36088622d0f623741019e806f6ef49eaefff9a3273461fa59a78fa" exitCode=0
Sep 30 15:15:01 crc kubenswrapper[4676]: I0930 15:15:01.803118 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b" event={"ID":"71a88c33-3009-4d1c-8019-49936682c1e1","Type":"ContainerDied","Data":"9426bb5bbb36088622d0f623741019e806f6ef49eaefff9a3273461fa59a78fa"}
Sep 30 15:15:01 crc kubenswrapper[4676]: I0930 15:15:01.803151 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320755-7622b" event={"ID":"71a88c33-3009-4d1c-8019-49936682c1e1","Type":"ContainerStarted","Data":"762dc83ce11d2e248556b941bbfb2707f60e3e4e4b180e2f39db47cbf8cf34ad"}